>>Perhaps, except that we're talking interpreter warnings, not compiler
>>warnings.  The compiler knows about three-part for loops; the
>>interpreter does not.  Likewise, the compiler knows about string
>>and array and slice interpolation; the interpreter does not.

>Then we need to either:
>
>A) Post in big bold letters someplace that internal optimizations may make
>   the errors they get at runtime a little odd
>
>or
>
>B) Attach enough hint info to the optree to emit errors correct for the
>   original program text
>
>A's easier, B's better but more expensive both in compile time and porter
>time.  If the optimizer gets seriously attacked, though, it might be worth it.

There are many, many cases of this that could be "attacked".  Essentially,
what people are suddenly expecting is that the compiler insert complete
source code information during compilation so that at run time, the
interpreter can phrase its diagnostics in terms of the original text.
Kind of like cc -g.  This is potentially prohibitively expensive, though,
as you point out.  It's not like all cc compiles run with -g, and for
good reason.

Choice 1: Remove the warning, and many others like it.

Choice 2: Educate the users.  This is inevitable.  Things just are not
what they were.  This is a compiler; that's what compilers do.  Ever
tried debugging highly optimized code from a DEC compiler?  Whew!

Choice 3: Redesign the compiler and interpreter to store and access
complete source code information.

The bigger issue, if you're going to be bloating the optree with full
source information, is dealing with $x + $y and not knowing which
variable was wicked.  It doesn't say

    Undefined value in lexical $x at line 23 of Foo.pm
    Undefined value in global $Foo::y at line 23 of Foo.pm

--tom