label-text: New in 2.10
hidden: true
---

Since 2.10.0, Scala includes macros, which can be enabled
with `import language.experimental.macros` on a per-file basis
or with `-language:experimental.macros` on a per-compilation basis.

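For example, a file that defines macros only needs the import at the top (a minimal sketch; the object name is illustrative):

    // enables the `macro` keyword for macro definitions in this file only
    import language.experimental.macros

    object MacroDefinitionsGoHere {
      // macro definitions placed here will now compile
    }
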
Macros significantly simplify code analysis and code generation, which makes them a tool of choice for
a multitude of [[http://scalamacros.org/usecases/index.html real-world use cases]].
Scenarios that traditionally involve writing and maintaining boilerplate can be addressed
with macros in a concise and maintainable way.

### Writing macros

This documentation page explains a type-safe `printf` macro through an end-to-end example.
To follow the example, create a file named <code>Macros.scala</code> and paste the following
code (be sure to follow the comments in the code: they reveal important things to know about
the macro system, the accompanying APIs and the infrastructure):

    import scala.reflect.macros.Context
    import collection.mutable.ListBuffer
    import collection.mutable.Stack

    object Macros {
      // a macro definition is a normal function with almost no restrictions on its signature
      // its body, though, is nothing more than a reference to an implementation
      def printf(format: String, params: Any*): Unit = macro impl

      // a macro implementation must correspond to the macro definitions that use it
      // the required signature is quite involved, but the compiler knows what it wants
      // should a mismatch occur, it will print the expected signature in the error message
      def impl(c: Context)(format: c.Expr[String], params: c.Expr[Any]*): c.Expr[Unit] = {
        // STEP I: the compiler API is exposed in scala.reflect.macros.Context
        // its most important part, the reflection API, is accessible via c.universe
        // it's customary to import c.universe._
        // because it includes a lot of routinely used types and functions
        import c.universe._

        // STEP II: the macro starts by parsing the provided format string
        // macros run at compile time, so they operate on trees, not on values
        // this means that the format parameter of the macro will be a compile-time literal
        // not an object of type java.lang.String
        // this also means that the code below won't work for printf("%d" + "%d", ...)
        // because in that case format won't be a string literal
        // but rather an abstract syntax tree that represents the addition of two string literals
        val Literal(Constant(s_format: String)) = format.tree

        // STEP IIIa: after parsing the format string, the macro needs to generate the code
        // that will partially perform formatting at compile time
        // the snippet below creates temporary vals that precompute the arguments
        // to learn about dynamic generation of Scala code, follow the documentation
        // on trees, available in the scala.reflect.api package
        val evals = ListBuffer[ValDef]()
        def precompute(value: Tree, tpe: Type): Ident = {
          val freshName = newTermName(c.fresh("eval$"))
          evals += ValDef(Modifiers(), freshName, TypeTree(tpe), value)
          Ident(freshName)
        }

        // STEP IIIb: tree manipulations proceed in this code snippet
        // the snippet extracts trees from the parameters of the macro and transforms them
        // note the use of typeOf to create Scala types corresponding to format specifiers
        // information on types can be found in the docs for the scala.reflect.api package
        val paramsStack = Stack[Tree]((params map (_.tree)): _*)
        val refs = s_format.split("(?<=%[\\w%])|(?=%[\\w%])") map {
          case "%d" => precompute(paramsStack.pop, typeOf[Int])
          case "%s" => precompute(paramsStack.pop, typeOf[String])
          case "%%" => Literal(Constant("%"))
          case part => Literal(Constant(part))
        }

        // STEP IV: the code that has been generated is now combined into a Block
        // note the call to reify, which provides a shortcut for creating ASTs
        // reify is discussed in detail in the docs for scala.reflect.api.Universe
        val stats = evals ++ refs.map(ref => reify(print(c.Expr[Any](ref).splice)).tree)
        c.Expr[Unit](Block(stats.toList, Literal(Constant(()))))
      }
    }

To summarize the code provided above, macros are mini compiler plugins that get executed whenever the compiler
comes across an invocation of a method declared as a macro. Such methods, dubbed ''macro definitions'', use
the `macro` keyword to reference ''macro implementations''.
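
In its smallest form this pairing looks like the sketch below (a hypothetical `hello`/`helloImpl` pair, separate from the `printf` example): the definition advertises an ordinary signature, while the implementation produces the replacement code at compile time.

    import scala.reflect.macros.Context
    import language.experimental.macros // macro definitions must be enabled (see "Compiling macros" below)

    object HelloMacro {
      // macro definition: an ordinary-looking method whose body is just `macro <impl>`
      def hello(): Unit = macro helloImpl

      // macro implementation: receives the context and returns the replacement code
      def helloImpl(c: Context)(): c.Expr[Unit] = {
        import c.universe._
        // reify turns regular Scala code into the tree that replaces the call site
        reify { println("hello from a macro expansion") }
      }
    }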

Macro implementations use the [[scala.reflect.api.package reflection API]] to communicate
with the compiler. The gateway to that API is `c`, a ubiquitous parameter of type [[scala.reflect.macros.Context]]
that must be present in all macro implementations. The compiler's universe is available through `c.universe`.
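
Besides the universe, the context also exposes positions and compile-time diagnostics. As a sketch, here is a hypothetical `todo` macro that does nothing at run time but leaves a compiler warning at every call site (`c.warning` and `c.enclosingPosition` belong to the `Context` API):

    import scala.reflect.macros.Context
    import language.experimental.macros

    object Warn {
      def todo(message: String): Unit = macro todoImpl

      def todoImpl(c: Context)(message: c.Expr[String]): c.Expr[Unit] = {
        // the customary preamble: bring the universe's types and functions into scope
        import c.universe._
        // talk to the compiler: emit a warning pointing at the call site
        c.warning(c.enclosingPosition, "TODO at this call site: " + message.tree)
        // the expansion itself is just the unit value
        reify { () }
      }
    }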

Input arguments to a macro implementation are [[scala.reflect.api.Trees abstract syntax trees]], which
correspond to the arguments of the method invocation that triggered the macro expansion. These trees
are wrapped in [[scala.reflect.api.Exprs exprs]], typed wrappers over trees.
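
That is why the `printf` implementation above pattern matches `format.tree` against a `Literal`: the `tree` member of an expr exposes the underlying syntax. The same idea in isolation, as a sketch (a hypothetical `requireLiteral` macro; `c.abort` aborts the expansion with an error):

    import scala.reflect.macros.Context
    import language.experimental.macros

    object Exprs {
      def requireLiteral(s: String): String = macro requireLiteralImpl

      def requireLiteralImpl(c: Context)(s: c.Expr[String]): c.Expr[String] = {
        import c.universe._
        s.tree match {
          // requireLiteral("hi") arrives as a string literal tree and is passed through
          case Literal(Constant(_: String)) => s
          // requireLiteral("h" + "i") arrives as an Apply node, not a literal
          case other => c.abort(c.enclosingPosition, "expected a string literal, got: " + showRaw(other))
        }
      }
    }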

The end result produced by a macro implementation is an abstract syntax tree
wrapped in an expr. This tree represents the code that the compiler will use
to replace the original method invocation.
To learn more about how to create trees that correspond to given Scala code and how to perform
tree manipulations, visit [[scala.reflect.api.Trees the documentation page on trees]].
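
Both ways of producing that expr appear in the `printf` implementation above: writing ordinary code inside `reify`, or assembling the tree by hand and wrapping it with `c.Expr`. A side-by-side sketch (a hypothetical parameterless `answer` macro):

    import scala.reflect.macros.Context
    import language.experimental.macros

    object Results {
      def answer: Int = macro answerImpl

      def answerImpl(c: Context): c.Expr[Int] = {
        import c.universe._
        // option 1: let reify build the tree from ordinary Scala code
        val viaReify: c.Expr[Int] = reify { 21 + 21 }
        // option 2: assemble the tree manually and wrap it in an expr
        val byHand: c.Expr[Int] = c.Expr[Int](Literal(Constant(42)))
        // either expr would do as the expansion; here the hand-built one is returned
        byHand
      }
    }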

### Compiling macros

In 2.10.0, macros are an experimental feature, so they need to be enabled before use.
Normal compilation of the snippet written above (using `scalac Macros.scala`) fails as follows:

    C:/Projects/Kepler/sandbox>scalac Macros.scala
    Macros.scala:8: error: macro definition needs to be enabled
    by making the implicit value language.experimental.macros visible.
    This can be achieved by adding the import clause 'import language.experimental.macros'
    or by setting the compiler option -language:experimental.macros.
    See the Scala docs for value scala.language.experimental.macros for a discussion
    why the feature needs to be explicitly enabled.
      def printf(format: String, params: Any*): Unit = macro impl
          ^
    one error found

To enable macros, use either `import language.experimental.macros` on a per-file basis
or the `-language:experimental.macros` compiler option on a per-compilation basis:

    C:/Projects/Kepler/sandbox>scalac -language:experimental.macros Macros.scala
    <scalac has exited with code 0>

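If the build is driven by sbt rather than by invoking `scalac` directly, the same switch can be set once in the build definition. A sketch, assuming a reasonably recent sbt and a plain `build.sbt`:

    // build.sbt -- enables macro definitions for every compilation in this project
    scalaVersion := "2.10.0"

    scalacOptions += "-language:experimental.macros"
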
### Using macros

Create a file named <code>Test.scala</code> and paste the following code (it is just as simple as that:
to use a macro, it is only necessary to import it and call it as if it were a regular function).

    object Test extends App {
      import Macros._
      printf("hello %s!", "world")
    }

An important rule about using macros is separate compilation. To perform macro expansion, the compiler
needs the macro implementation in executable form. Thus macro implementations need to be compiled before
the main compilation; otherwise the compiler will produce `macro implementation not found` errors.

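One common way to honor this rule is to keep macro implementations in a dedicated subproject that the rest of the build depends on. A sketch of such a layout, assuming an sbt multi-project build (the project names are illustrative):

    // build.sbt -- a two-project layout that guarantees separate compilation
    lazy val macros = (project in file("macros"))
      .settings(scalacOptions += "-language:experimental.macros")

    // `core` sees the already-compiled macros, so expansions can run while core is compiled
    lazy val core = (project in file("core"))
      .dependsOn(macros)
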
In the REPL, however, macros and their usages can be written in the same session. That's because
the REPL compiles every line of input in a separate compilation run.

    C:/Projects/Kepler/sandbox>scalac Test.scala
    <scalac has exited with code 0>

    C:/Projects/Kepler/sandbox>scala Test
    hello world!

The test snippet seems to work! To see what happens under the covers, enable the `-Ymacro-debug-lite` compiler flag:

    C:/Projects/Kepler/sandbox>scalac -Ymacro-debug-lite Test.scala
    typechecking macro expansion Macros.printf("hello %s!", "world") at
    source-C:/Projects/Kepler/sandbox\Test.scala,line-3,offset=52
    {
      val eval$1: String = "world";
      scala.this.Predef.print("hello ");
      scala.this.Predef.print(eval$1);
      scala.this.Predef.print("!");
      ()
    }
    Block(List(
      ValDef(Modifiers(), newTermName("eval$1"), TypeTree(String), Literal(Constant("world"))),
      Apply(
        Select(Select(This(newTypeName("scala")), newTermName("Predef")), newTermName("print")),
        List(Literal(Constant("hello ")))),
      Apply(
        Select(Select(This(newTypeName("scala")), newTermName("Predef")), newTermName("print")),
        List(Ident(newTermName("eval$1")))),
      Apply(
        Select(Select(This(newTypeName("scala")), newTermName("Predef")), newTermName("print")),
        List(Literal(Constant("!"))))),
      Literal(Constant(())))

With `-Ymacro-debug-lite` one can see both the pseudo-Scala representation of the code generated by macro expansion
and the raw AST representation of the expansion. Both have their merits: the former is useful for surface-level analysis,
while the latter is invaluable for fine-grained debugging.

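The same two views are also available programmatically, which is handy for debugging a single macro rather than a whole compilation run. A sketch of a hypothetical `traced` macro that prints both renderings of its argument at expansion time (`show` and `showRaw` come with `import c.universe._`):

    import scala.reflect.macros.Context
    import language.experimental.macros

    object DebugViews {
      def traced(x: Any): Any = macro tracedImpl

      def tracedImpl(c: Context)(x: c.Expr[Any]): c.Expr[Any] = {
        import c.universe._
        println(show(x.tree))    // pseudo-Scala rendering, as printed by -Ymacro-debug-lite
        println(showRaw(x.tree)) // raw AST rendering, e.g. Apply(Select(...), List(...))
        x
      }
    }
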
### Writing bigger macros

When the code of a macro implementation grows big enough to warrant modularization beyond the body
of the implementation method, it becomes apparent that one needs to carry around the context parameter,
because most things of interest are path-dependent on the context.

One approach is to write a class that takes a parameter of type `Context` and then split the
macro implementation into a series of methods of that class. This is natural and simple, except that
it is hard to get right. Here is a typical compilation error:

    scala> import scala.reflect.macros.Context
    import scala.reflect.macros.Context

    scala> class Helper(val c: Context) {
         |   def generate: c.Tree = ???
         | }
    defined class Helper

    scala> def impl(c: Context): c.Expr[Unit] = {
         |   val helper = new Helper(c)
         |   c.Expr(helper.generate)
         | }
    <console>:32: error: type mismatch;
     found   : helper.c.Tree
        (which expands to)  helper.c.universe.Tree
     required: c.Tree
        (which expands to)  c.universe.Tree
               c.Expr(helper.generate)
                      ^

The problem in this snippet is a path-dependent type mismatch. The Scala compiler
does not understand that `c` in `impl` is the same object as `c` in `Helper`, even though the helper
is constructed using the original `c`.

Luckily, just a small nudge is all that is needed for the compiler to figure out what's going on.
One possible way of providing it is to use refinement types (the example below is the simplest
application of the idea; for instance, one could also write an implicit conversion from `Context`
to `Helper` to avoid explicit instantiations and simplify the calls):

    scala> abstract class Helper {
         |   val c: Context
         |   def generate: c.Tree = ???
         | }
    defined class Helper

    scala> def impl(c1: Context): c1.Expr[Unit] = {
         |   val helper = new { val c: c1.type = c1 } with Helper
         |   c1.Expr(helper.generate)
         | }
    impl: (c1: scala.reflect.macros.Context)c1.Expr[Unit]

An alternative approach is to use the [[scala.Singleton]] upper bound to express the fact
that `Helper`'s type parameter `C` is instantiated with the singleton type of `impl`'s `c` (note that it is
mandatory to explicitly spell out the type argument when instantiating `Helper`):

    scala> class Helper[C <: Context with Singleton](val c: C) {
         |   def generate: c.Tree = ???
         | }
    defined class Helper

    scala> def impl(c: Context): c.Expr[Unit] = {
         |   val helper = new Helper[c.type](c)
         |   c.Expr(helper.generate)
         | }
    impl: (c: scala.reflect.macros.Context)c.Expr[Unit]