* Father Chrysostomos <sprout@cpan.org> [2012-01-03T13:01:42]
> Ricardo Signes wrote:
> > C<defined> does not really impose scalar context as usually understood.
> > For example, defined(&foo) does not call foo in scalar context. It is
> > its own thing, which is, basically, the problem.
>
> This is a syntactic distinction. defined(&foo), defined(@foo) and
> defined(%foo) are special cases. All others (should) work this way (bugs
> aside):

My point is more that it has already been made a weird snowflake. I do want
defined to create a scalar context, but right now, it's weirder.

--
rjbs
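
A minimal sketch of the special case under discussion, for readers following
along (the sub name "foo" is from the thread above; "bar" is a placeholder
added here for contrast): defined(&foo) inspects the sub slot without ever
invoking the sub, whereas an ordinary operand of defined is evaluated first.

    #!/usr/bin/perl
    use strict;
    use warnings;

    sub foo { warn "foo was called\n"; 1 }

    # defined(&foo) reports whether &foo has a body; it never calls foo,
    # so no "foo was called" warning is emitted by this line.
    print defined(&foo) ? "sub foo is defined\n" : "sub foo is not defined\n";

    # A nonexistent sub name simply yields false, with no
    # "Undefined subroutine" error, because nothing is called.
    print defined(&bar) ? "sub bar is defined\n" : "sub bar is not defined\n";

    # Contrast with an ordinary operand, which is evaluated as usual:
    my $x = foo();    # this call does run foo and emits the warning
    print defined($x) ? "\$x is defined\n" : "\$x is not defined\n";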