Context Holders: Realizing Multiple Layer Activation Mechanisms in a Single Context-Oriented Language
by Tomoyuki Aotani, Tetsuo Kamina and Hidehiko Masuhara
Abstract:
We propose LamFJ, a calculus for expressing various layer activation mechanisms in context-oriented programming languages. LamFJ extends Featherweight Java with context holders, an abstraction of dynamic layer activation. By encoding programs that use different layer activation mechanisms into programs that manipulate context holders, LamFJ serves as a foundation for reasoning about interactions between the different mechanisms. This paper presents a sketch of context holders and encodings of existing layer activation mechanisms.
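For readers unfamiliar with layer activation in context-oriented programming, the following minimal Java sketch illustrates the idea the abstract refers to: a context holder modeled as a mutable cell recording the currently active layers, with block-structured activation in the style of ContextJ's with construct. All identifiers here (ContextHolder, with, isActive, Layer, Greeter) are hypothetical illustrations under those assumptions, not the paper's LamFJ syntax.

import java.util.ArrayDeque;
import java.util.Deque;

// Hypothetical names throughout; this is not the paper's LamFJ syntax.
enum Layer { LOGGING }

final class ContextHolder {
    // The holder: a mutable cell recording which layers are active.
    // A single static deque suffices for this single-threaded sketch.
    private static final Deque<Layer> active = new ArrayDeque<>();

    // Block-structured activation: the layer is active exactly for
    // the dynamic extent of body, as in a ContextJ-style `with` block.
    static void with(Layer l, Runnable body) {
        active.push(l);
        try {
            body.run();
        } finally {
            active.pop();
        }
    }

    static boolean isActive(Layer l) {
        return active.contains(l);
    }
}

class Greeter {
    // A layered method: its behavior depends on the holder's contents.
    String greet() {
        return ContextHolder.isActive(Layer.LOGGING) ? "[log] hello" : "hello";
    }
}

public class Demo {
    public static void main(String[] args) {
        Greeter g = new Greeter();
        System.out.println(g.greet());                  // prints "hello"
        ContextHolder.with(Layer.LOGGING,
                () -> System.out.println(g.greet()));   // prints "[log] hello"
    }
}

Different activation mechanisms (block-structured, per-instance, event-based) would then correspond to different disciplines for updating such a holder, which is the interaction the calculus is designed to make explicit.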
Reference:
Context Holders: Realizing Multiple Layer Activation Mechanisms in a Single Context-Oriented Language (Tomoyuki Aotani, Tetsuo Kamina and Hidehiko Masuhara), In Proceedings of the Workshop on Foundations of Aspect-Oriented Languages (FOAL'14) (Eric Bodden, ed.), 2014.
Bibtex Entry:
@inproceedings{aotani2014foal,
  url = {http://www.cs.ucf.edu/~leavens/FOAL/index-2014.shtml},
  location = {Lugano, Switzerland},
  month = apr,
  editor = {Eric Bodden},
  year = 2014,
  booktitle = {Proceedings of the Workshop on Foundations of Aspect-Oriented Languages (FOAL'14)},
  pdf = {foal2014.pdf},
  author = {Tomoyuki Aotani and Tetsuo Kamina and Hidehiko Masuhara},
  title = {Context Holders: Realizing Multiple Layer Activation Mechanisms in a Single Context-Oriented Language},
  pages = {3--6},
  doi = {10.1145/2588548.2588552},
  abstract = {We propose LamFJ, a calculus for expressing various layer activation mechanisms in context-oriented programming languages. LamFJ extends Featherweight Java with context holders, an abstraction of dynamic layer activation. By encoding programs that use different layer activation mechanisms into programs that manipulate context holders, LamFJ serves as a foundation for reasoning about interactions between the different mechanisms. This paper presents a sketch of context holders and encodings of existing layer activation mechanisms.}
}