-
Calling/using a Python function or Class from Q
Posted by simon_watson_sj on July 12, 2023 at 12:00 am
Hey Team,
Trying to get my head around PyKX and wondered – is it possible to import a Python function or class into Q and then pass it arguments to get a response?
I have been using EmbedPy to call TensorFlow or PyTorch models. This is great when you have a good GPU because you can set up a load balancer and then use a queue of ‘jobs’ to run multiple models simultaneously across available worker processes. When a process finishes one model, it just pulls another one off the queue and gets to work.
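For context, with EmbedPy the calls look roughly like this (a simplified sketch – my_models and predict are placeholder names, not a real module):
\l p.q                       / load embedPy
mdl:.p.import`my_models      / hypothetical Python module on the PYTHONPATH
pred:mdl[`:predict]          / grab a Python function as a callable object
pred[til 10]`                / call it with q arguments; trailing ` converts the result to q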
Looking at the docs, it appears that using PyKX to call Python from Q (rather than accessing Q from Python) requires an ‘Insights’ license, so it is not suitable for home use. Is that correct?
Simon
6 Replies
-
pykx.q provides this functionality. Most EmbedPy code can be migrated by replacing .p with .pykx
https://code.kx.com/pykx/1.6/api/pykx_under_q.html
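For example, a typical EmbedPy snippet migrates almost mechanically – a sketch using the Python standard-library statistics module purely as an illustration:
/ EmbedPy
stats:.p.import`statistics
stats[`:mean][1 2 3 4]`       / 2.5
/ pykx.q - same shape, .p replaced by .pykx
stats:.pykx.import`statistics
stats[`:mean][1 2 3 4]`       / 2.5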
The needed file is included when you pip install pykx
To make it available to your q processes, run this command to copy the needed files:
python -c "import pykx as kx; kx.install_into_QHOME()"
Then you can use it as you describe:
q)\l pykx.q
q)np:.pykx.import`numpy
q)np[`:arange][10]`
0 1 2 3 4 5 6 7 8 9
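Passing arguments to an imported function or class works the same way – wrapped Python objects accept q arguments directly, and a trailing ` converts the result back to q. A sketch using the standard-library math and datetime modules as illustrations (exact conversion behaviour may differ slightly between versions):
q)math:.pykx.import`math
q)math[`:pow][2;10]`
1024f
q)d:.pykx.import[`datetime][`:date][2023;7;12]   / instantiate a Python class from q
q)d[`:isoformat][]`
"2023-07-12"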
There are still some improvements needed to reach parity/maturity which will be coming in the next few releases of PyKX.
License flags are needed to use this feature, but they are now provided with all downloads.
You can check your license in q:
q)`insights.lib.embedq`insights.lib.pykx in `$" " vs .z.l 4
11b
If you do not have the flags, you just need to re-download and a new license with the flags will be included.
https://kx.com/kdb-personal-edition-download/
https://kx.com/kdb-insights-personal-edition-license-download/
-
Thanks so much for this – sorry for the delay in response. This does look like an exciting new development and it’s great for KDB/Q – as well as enabling the integration of a vast library of useful stuff into Q (with, as I understand it, some work under the hood to help with speed), let’s hope PyKX becomes a gateway drug to get Python users hooked on vector programming and KDB/Q!
Simon
-
An update to the links above: downloading a license with the PyKX flags must be done using the Insights link specifically:
https://kx.com/kdb-insights-personal-edition-license-download/
-
Thanks – I have actually downloaded the Linux and Docker versions of this but just need to get a quiet afternoon to set it up (and I do have the additional license).
There are a couple of things of interest to me with the ‘Insights approach’ that are hazy. In my current setup I use a collection of 3 functions with a load balancer, which lets me split a query into elements that can be farmed off and processed by other q processes; I then handle the inbound results via .z.po. To set up my load balancer and q processes I have a bash script that takes flags to determine the precise config. Updating that, I imagine I just swap out each q startup expression for one Docker Insights startup and ensure they are all mapped to the same file hierarchy? That means if I had say 10 q processes previously, now I have 10 Insights Docker containers running (not as bad as it sounds – maybe a perfect use case for Docker in fact).
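For reference, the dispatch side of my setup is roughly this shape (an illustrative sketch only – the handle numbers and function names here are made up, not my actual code):
workers:hopen each 5001 5002 5003i            / handles to worker q processes started elsewhere
results:()
onResult:{results,::enlist x}                 / workers send their output back here
dispatch:{[j] h:workers j mod count workers;  / naive round-robin over workers
  neg[h](`runJob;j;`onResult)}                / async call: run job j, reply to onResult
dispatch each til 10                          / farm out a queue of 10 jobs
/ on each worker, something like: runJob:{[j;cb] neg[.z.w](cb;process j)}  / process is the model run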
Also, to run a Developer instance, would I just treat the q prompt generated by a Docker Insights process as a standard q prompt and use the standard Developer expression as usual?
so:
alias developer='source /home/userName/developer/config/config.profile; q /home/userName/developer/launcher.q_ '
Finally, I am drawn to the contained nature of the dockerised solution over the Linux install (I’m on Ubuntu), but my concern is the loss of processing capability. As I understand it, the usual ‘Docker penalty’ is about 10% of processing speed over a ‘bare metal’ deployment.
Once I get that afternoon, I will let you know how I go.
Simon