perl.perl5.porters | Postings from July 2012

Re: supporting untarring of extensions (was Re: Detecting duplicate extension directories)

Nicholas Clark
July 11, 2012 13:49
On Wed, Jul 11, 2012 at 02:51:29PM -0500, Jesse Luehrs wrote:
> On Wed, Jul 11, 2012 at 08:23:09PM +0100, Nicholas Clark wrote:

> > Other than Configure probes, what is causing pain for dual life modules
> > sitting in the existing build process? We must fix that, I agree.
> > But I'm not aware of the specifics of the problems, which makes it hard
> > to help.
> > 
> > (This stuff is hard to get working at all, as it has to work on 3 disjoint
> > families of platforms, which make the differences between Linux and AIX seem
> > tiny.)
> The problems I'm referring to have to do with the way we currently ship
> dual-life modules. For instance: the version.pm that ships with core
> doesn't include the code to monkey-patch UNIVERSAL::VERSION, because it
> knows that since it's part of core, UNIVERSAL::VERSION already works the
> way it wants it to. The version on CPAN *does* include this code, even
> though it claims to be the same version number. This sort of thing is
> not uncommon among the modules that we ship with core.

You're never going to fix version.pm with this, because parts of it are
hard coded into util.c. Sorry.

(But if it *is* possible to kick parts of util.c out into XS code, then
it may well be possible to fix how we handle version.pm without changing
the build system at all.)

Which other modules are a pain?

> The issue here becomes that every exception like this makes it harder to
> update core versions of modules when a new CPAN release is made. Sure, a
> lot of this can be automated away (Porting/sync-with-cpan is incredibly
> helpful here), but there is still a lot of manual intervention needed to
> ensure that the automation works (because it will always be nothing more
> than a guess), and there are various parts of it that aren't automatable
> at all.

Seriously, please can you start listing them. If the core build can't take a
CPAN tarball unpacked verbatim to the same structure into a directory down
in cpan/ then that needs fixing, and I will try hard to fix it. If there's
a reason why we can't unpack to the same structure as the CPAN distribution,
that also needs fixing.
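The "unpack verbatim into cpan/" idea can be sketched in a few lines of shell. Everything here is a made-up stand-in (the Acme-Demo distribution, the work/ directory); the point is only that the tree under cpan/ ends up byte-for-byte the same shape as the CPAN distribution:

```shell
# Sketch: unpack a CPAN-style tarball verbatim into cpan/.
# "Acme-Demo" and all paths are invented stand-ins for a real dist.
set -e

# Fake up a distribution with the usual CPAN layout.
mkdir -p work/Acme-Demo-0.01/lib/Acme
printf 'package Acme::Demo;\n1;\n' > work/Acme-Demo-0.01/lib/Acme/Demo.pm
printf 'use ExtUtils::MakeMaker;\n' > work/Acme-Demo-0.01/Makefile.PL
tar -czf work/Acme-Demo-0.01.tar.gz -C work Acme-Demo-0.01
rm -r work/Acme-Demo-0.01

# The "just drop the tarball in" step: extract under cpan/ unchanged.
mkdir -p work/cpan
tar -xzf work/Acme-Demo-0.01.tar.gz -C work/cpan

# The tree under cpan/ now mirrors the CPAN distribution exactly.
find work/cpan -type f | sort
```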

And I'm having trouble seeing why some other tool(chain) is going to be able
to get this right if the core build can't. It's using all the same ExtUtils
modules as the CPAN toolchain.

> Moving the process for building dual-life modules toward "just drop the
> same tarball you uploaded to CPAN into this directory" would be a
> *great* help in simplifying a lot of the tedium involved.

Right. But the code that unpacks those tarballs IS DUAL LIFE.

a) there's a bootstrapping problem here.
b) the only significant difference you seem to describe is that one starts from
   a *packed* tarball, instead of an unpacked tarball.

So what am I missing here?

> Additionally, this would allow external packagers to have a much easier
> time building a stripped-down version of perl for whatever reason (as
> Fedora and whoever else seem interested in doing). All you would (well,
> should) have to do would be to remove the appropriate tarballs from the
> right place, and they would be gone. No need to fiddle with the build
> system at all.

Yes. I'd love this to be possible. And in the original message I suggested
a route that should permit this.

But right now, what you're describing would *also* seem to be possible using
an external driver script that

a) configures, builds and installs perl
b) fires up a CPAN client
c) gets it to build things

(and doesn't *need* to be in the core at all)
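Such a driver might look roughly like the following. This is a hypothetical sketch in dry-run form (each step is logged rather than executed, since actually configuring and building perl is slow); the prefix path and the Some::Extension module name are invented:

```shell
# Hypothetical external driver script, dry-run sketch.
set -e
PREFIX=$PWD/perl-stage                           # throwaway install prefix
run() { echo "would run: $*" | tee -a driver.log; }  # swap for "$@" to really build

# a) configure, build and install perl into a private prefix
run ./Configure -des -Dprefix="$PREFIX"
run make
run make install

# b) fire up a CPAN client from that install ...
# c) ... and get it to build the extra tarballs
run "$PREFIX/bin/perl" -MCPAN -e 'install("Some::Extension")'
```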

Which means that we seem to be discussing two possibly disjoint problems

1) that the existing handling of dual life modules has a reputation for being
   painful

2) that we would also like to be able to handle building arbitrary
   collections of extra tarballs in *an* integrated process

So part of my questioning now would be "what does the core do wrong that
stops an external driver script being useful?" Because it's going to be
the same things that need solving whether such a driver is internal or
external.
I'm sorry if I'm sounding like an inquisition. I appreciate that there *is*
pain, but I'm really failing to grasp any specific details of where that
pain is coming from.

> > configure, build, (test) and install really need to stay as 4 distinct
> > phases.
> "Install" doesn't necessarily mean "install system-wide". We already
> build miniperl, and then use it to build the rest of the extensions, so
> building an executable and then running it isn't itself inherently an
> issue. I don't see why we couldn't just install the stripped down (but
> full) version of perl into a separate directory within the build
> directory itself (similar to how home-directory installations currently
> work), use it during the build process, and then remove it at the end of
> the build phase.

How does this differ from the current situation, where everything *is*
(effectively) installed in lib/ during the build process?

I guess, "because the paths are wrong".

But that isn't going to be solved unless one builds (I guess) at least
a perl binary and a Config.pm that reflect the intermediate location.
So one then would have to build some chunk of things twice - once for the
temporary install location, and once again for the final real location.

And if it's a build location private to "us", there's no need to remove it.

I'm not exactly sure what you're envisaging here, or how it differs
currently from what we have once miniperl and all the non-XS extensions
are built. Do we have any part of the later build process that needs XS
extensions? (Or is hurting because it doesn't have them?)

There is potentially a very simple solution to that - a third perl binary
between miniperl and perl. A sort of mesoperl (hybrid), which *does* have
DynaLoader (so can run XS code) but also has the miniperl buildcustomize
logic (so has a way to fix @INC to be absolute paths into the build
tree).
Nicholas Clark
