demerphq wrote:
> There are some internal APIs related to glob names and package names
> that will cause Perl to segfault if you feed them broken UTF-8 marked
> as UTF-8.
>
> This was discovered by AFL fuzzing of Sereal. Since these are internal
> APIs, it is arguable that it is the caller's responsibility to ensure
> the UTF-8 is valid. But it is also arguable that Perl should choke on
> or refuse such strings anyway.
>
> I don't care either way; I am just curious whether people think this
> is bug-report worthy. If so, I will file a more detailed ticket. If
> not, maybe I'll file a doc patch for the functions I know about.
>
> Anybody care to express an opinion?
>
> Yves

There is a limit to how paranoid you can be. You can run your server processes inside valgrind 100% of the time for security's sake, or use ECC RAM. In XS code there is no security if the C code doesn't want it: I can mmap() and mprotect() myself a chunk of executable memory, write whatever machine code I want into it, and take over the process.

I vote for docs, not for validating a UTF-8 string half a dozen times per hash key fetch or delete. The Perl C API doesn't really know whether its arguments come from XS modules or from the op functions of the pure-Perl op tree, and it is too complicated to keep track of whether a UTF-8 string came from pure Perl or from third-party XS code.

One way would be to use the PERL_CORE define to redefine all the UTF-8-aware Perl API C functions to UTF-8-sanitizing versions (a sketch of the mechanism appears at the end of this mail): checking not only continuation bytes, but code point ranges (surrogate code points in UTF-8) and normalization too. How about banning PUA code points in hash keys and package names while we're at it? Bulk88 would be the first to build CPAN libs with -DPERL_CORE if that happened.

As a serializer, you are taking in foreign, hostile data, so you have to make sure it can't command things that aren't allowed: interpolating user-submitted data into a string and then eval'ing that string, or using a property value to dispatch a method call on an object, where someone then figures out the names of other methods the AJAX/HTML interface leaves undocumented. A minimal sketch of that one-time boundary check follows.
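For concreteness, here is a minimal sketch of the check a caller could do once at the boundary before handing untrusted keys to the UTF-8-aware internals. is_utf8_string() is the public Perl API test for well-formed UTF-8; the wrapper name and the croak message are invented for illustration:

    /* Validate an untrusted SV once before using it as a hash key.
     * Only SVs with the SVf_UTF8 flag set can smuggle malformed
     * UTF-8 into the utf8-aware code paths, so plain byte strings
     * pass through unchecked. */
    static SV *
    defensive_fetch(pTHX_ HV *hv, SV *key)
    {
        STRLEN len;
        const char *pv = SvPV(key, len);
        HE *he;

        if (SvUTF8(key) && !is_utf8_string((const U8 *)pv, len))
            croak("refusing malformed UTF-8 hash key");

        he = hv_fetch_ent(hv, key, 0, 0);
        return he ? HeVAL(he) : NULL;
    }

The cost argument above still holds: this is one scan per key at the serializer boundary, not half a dozen scans inside every fetch or delete.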
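And for the record, the PERL_CORE redefinition idea would look roughly like this; the checked_hv_fetch_ent wrapper name is hypothetical, this is only a sketch of the mechanism, not actual core code:

    /* Hypothetical sanitizing layer: outside the core (no
     * -DPERL_CORE), route callers through checked wrappers that
     * validate the UTF-8 before calling the real function. Core
     * code, and anyone defining PERL_CORE, keeps the fast path. */
    #ifndef PERL_CORE
    #  undef  hv_fetch_ent
    #  define hv_fetch_ent(hv, key, lval, hash) \
              checked_hv_fetch_ent(aTHX_ (hv), (key), (lval), (hash))
    #endif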