On 12/24/20 3:31 PM, Ricardo Signes wrote:
> Once upon a time, a sponsor provided a server called dromedary. It was
> a big beefy hunk of hardware with a lot of RAM and a bunch of cores.
> It compiled perl very fast. Some people used it for bisecting, some for
> testing a bunch of different versions of perl, and so on.
>
> Later, a second dromedary existed and was used, but less.
>

Is the current dromedary #2 or #3? I got out of the habit of using
dromedary during the period after #1 disappeared.

Your message prompted me to time some testing on the new dromedary as
compared with my current, 6-year-old laptop.

#####
              build blead    make test_harness
laptop        3m22.110s      7m12.494s
dromedary     1m34.417s      3m48.966s

All times 'real' from time(1).  TEST_JOBS=4 on both machines.
#####

So I suspect that when raw speed is the concern, I'll be spending a lot
more time on dromedary!

But, as Karl's observation that the old dromedary had 700+ locales
suggests, the old dromedary was *administered*.  My impression is that
the new dromedary is just a VM someone is paying for.  For the
QA-oriented work that I do, we would need new C compilers installed as
they come along.  We would also need CPAN modules installed to do
things like recording build-time warnings.

> I've been trying to get a handle on the various resources we rely on,
> and dromedary has come up. If there's to be a third iteration of this
> box: *Who will use it and for what?*
>

If you're thinking of *all* our resources and not just dromedary, I
would add that our smoke-testing resources are, for me, absolutely
essential.  The three-month outage of perl5.test-smoke.org essentially
halted my Perl 5 work for that duration.  Even now, the smoke-test runs
I submit there are not being added to Tony's aggregator at
http://perl.develop-help.com.  And I'll bet anything that the new
perl5.test-smoke.org, like the old one, is not being backed up, which
means we're at just as much risk as before of losing years of data.

So please include those two sites in your thinking about our dev
resources.

Thank you very much.
Jim Keenan
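
A minimal sketch of the kind of commands that would produce the two
timing columns above, for anyone who wants to repeat the comparison.
Only time(1), TEST_JOBS=4, and "make test_harness" come from the
message itself; the Configure flags and the checkout directory are
assumptions, not Jim's exact invocation.

    # Assumed reproduction recipe; flags and paths are illustrative.
    cd perl                               # a git checkout of blead
    ./Configure -des -Dusedevel           # non-interactive development build
    time make                             # the "build blead" column
    time TEST_JOBS=4 make test_harness    # the "make test_harness" column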