Re: Using Inline::Python
From: Sean McAfee
Date: September 9, 2022 18:44
Subject: Re: Using Inline::Python
Message ID: CANan03bzXTCSphyWC=sY6WTEogdbKR-9sgGN5vyRyQFQaGPcpw@mail.gmail.com
I still see the same behavior. That is, if I run say spark.sql('select 1+1'); twice, the first call prints an empty array and the second produces the "instance has no attribute 'sql'" error message.
I forgot to mention that this is on a somewhat old version of Raku, 2021.04. But then, most of Inline::Python seems to be even older.
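For reference, a minimal standalone-script version of the steps described in this thread might look like the sketch below. It assumes Inline::Python is installed and that pyspark is importable by the Python interpreter that Inline::Python was built against:

    use Inline::Python;

    my \python = Inline::Python.new;

    # Run Python code for its side effects; without :eval this returns Any
    python.run('from pyspark.sql import SparkSession');

    # :eval makes run() return the value of the Python expression,
    # here wrapped as an Inline::Python::PythonObject
    my \spark = python.run('SparkSession.builder.getOrCreate()', :eval);
    say spark.^name;

    # Call .sql twice to exercise the alternating behavior described above
    say spark.sql('select 1+1');
    say spark.sql('select 1+1');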
On Fri, Sep 9, 2022 at 11:35 AM Elizabeth Mattijsen <liz@dijkmat.nl> wrote:
> To rule out any REPL artefacts, do you see the same thing if you put the
> code in a script and run the script?
>
> > On 9 Sep 2022, at 20:17, Sean McAfee <eefacm@gmail.com> wrote:
> >
> > Hello--
> >
> > I recently started playing around with PySpark. It soon occurred to me
> that it would be a lot more fun to work in Raku instead of Python, and I
> recalled that it's supposed to be possible to get handles to Python objects
> from Raku and call methods on them seamlessly, so I tried to make it
> happen. I got pretty far, but now I'm stymied. Here are the steps I can
> take in the Raku interpreter:
> >
> > > use Inline::Python
> > Nil
> > > my \python = Inline::Python.new
> > Inline::Python.new
> >
> > Self-explanatory.
> >
> > > python.run('from pyspark.sql import SparkSession')
> > (Any)
> >
> > No errors, that looks promising...
> >
> > > my \spark = python.run('SparkSession.builder.getOrCreate()', :eval)
> > ... spam from initialization of Spark session deleted...
> > Inline::Python::PythonObject.new(ptr =>
> NativeCall::Types::Pointer.new(4461193984), python => Inline::Python.new)
> >
> > Now we're getting somewhere! (I had to source-dive to guess that I
> needed that :eval; without it, an Any is returned.)
> >
> > > my \sql = spark.sql('select 1+1')
> > []
> >
> > Uh...what? I was expecting to get another Python object back, a
> DataFrame. (I think; I'm much more familiar with the Scala interface to
> Spark.) Instead I have an empty array.
> >
> > Even more puzzlingly, if I re-run that last statement, I get an error:
> "instance has no attribute 'sql'". If I re-run the statement over and
> over, the response alternates between an empty array and that error.
> >
> > Does anyone have any insight into what's going on?
> >
>
>
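As a short sketch of the :eval behavior mentioned above: without :eval, run() executes the Python code only for its side effects and returns Any; with :eval, the value of the Python expression is converted back to a Raku value. (The exact conversion of the result is an assumption here; the Any-versus-value distinction is what the thread describes.)

    use Inline::Python;

    my \python = Inline::Python.new;
    say python.run('1 + 1');         # (Any) -- executed, value discarded
    say python.run('1 + 1', :eval);  # 2, the Python int converted to Raku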