
DRAFT SPEC | Services are in active development and are subject to change.


The Charity Engine Smart Proxy Post Processing (SP3) interface allows web data that is collected via Smart Proxy scripts to be processed by any application or execution environment available on or supported by the Charity Engine network. This processing takes place on the same node that collects the data.

...

To submit data for local processing on the proxy node, reference the autoloaded "charityengine" module in a Node.js Smart Proxy script and call the desired processing function (*see Section 2 below for definitions of specific functions).

Processing functions return a Promise, which resolves with the response from the model or application. By awaiting the Promise (or chaining .then()), the response can be used within the Smart Proxy script and included in the crawl output.

For example, to generate embeddings for collected text using the Llama 3.2 model:

Code Block
languagejs
charityengine.embeddings('llama3.2:3b', textCollectedFromCrawl)
  .then(response => console.log('Embedding response:', response))
  .catch(error => console.error('Error:', error));

To generate a chi-squared distribution with v degrees of freedom using Wolfram Engine:

Code Block
languagejs
charityengine.wolframengine('ChiSquareDistribution[v]')
  .then(response => console.log('Response:', response))
  .catch(error => console.error('Error:', error));

To produce multiple outputs, the responses from crawls and processing functions can be combined into an object and returned from the Smart Proxy script, since the script's return value defines the job output.
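
For example, a script might await a processing function and return its response alongside the crawled data. The following is a minimal sketch only: textCollectedFromCrawl stands in for data already gathered by the crawl, and the object keys are illustrative rather than part of the SP3 interface.

Code Block
languagejs
// Minimal sketch of a Smart Proxy script body (variable and key names are illustrative).
const embeddingResponse = await charityengine.embeddings('llama3.2:3b', textCollectedFromCrawl);

// The returned object defines the job output.
return {
  crawledText: textCollectedFromCrawl,
  embedding: embeddingResponse
};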

Processing Functions

A variety of post-processing functions are available based on applications that are compatible with the Charity Engine network. 

...

Docker Applications

Any image that can be pulled from Docker Hub, or that is publicly accessible on the web, can be used for post-processing by calling the docker() function (*subject to specs of instance types used):

Code Block
languagejs
charityengine.docker(image, commandline, inputfile)
PARAMETERS
  image // URL or Docker Hub name of the Docker image to run [string] [required]
  commandline // Command to execute within the container [string] [required]
  inputfile // Names of local files to use as input [array of string]
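
For example, to run a script inside a Node.js container that sums a list of numbers (the "calc-sum.js" script and its arguments are illustrative; any image and command can be substituted):

Code Block
languagejs
// Pull node:slim from Docker Hub, stage calc-sum.js as a local input file,
// and execute it inside the container.
charityengine.docker('node:slim', 'node /local/input/calc-sum.js 1 2 3 4.2', ['calc-sum.js'])
  .then(response => console.log('Sum:', response))
  .catch(error => console.error('Error:', error));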

...

Execution Environments

The default execution environment is Node.js, per Section 1 above, "General Usage". To run source code or scripts in other interpreted languages, pull the execution environment of your choice from Docker Hub, following the instructions in Section 2.1 above. For example:

Code Block
languagejs
charityengine.docker("python:slim", "python /local/input/hello-world.py", ["hello-world.py"])

Charity Engine Application Library

Functions are also available for built-in applications that have been deployed to the Charity Engine network.

Note that some of these functions make use of proprietary software, which could incur additional runtime charges.

inference

Run LLM inference on a string

Code Block
languagejs
charityengine.inference(model, prompt, assets, context, system, template, options)
PARAMETERS
  model // Name of the model to use [string] [required]
  prompt // Text to pass to the model as input [string] [required]
  assets // Images or files [array of string]
  context // Additional context for the model [string]
  system // System parameters for the model [string]
  template // Template to guide response format [string]
  options // Additional options for the request [object]
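
For example, to run a prompt over collected text (the model name and prompt below are illustrative assumptions, not part of the specification):

Code Block
languagejs
// Illustrative sketch: the llama3.2:3b model and the prompt text are assumptions.
charityengine.inference('llama3.2:3b', 'Summarize the following text: ' + textCollectedFromCrawl)
  .then(response => console.log('Inference response:', response))
  .catch(error => console.error('Error:', error));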

embeddings

Generate a vector embedding for a string

Code Block
languagejs
charityengine.embeddings(model, prompt)
PARAMETERS
  model // Name of the model to use [string] [required]
  prompt // Text to pass to the model as input [string] [required]
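
For example (the model name and prompt text are illustrative):

Code Block
languagejs
charityengine.embeddings('llama3.2:3b', 'What is the weather')
  .then(response => console.log('Embedding response:', response))
  .catch(error => console.error('Error:', error));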

wolframengine

Run Wolfram Language Code

Code Block
languagejs
charityengine.wolframengine(commandline, inputfile)
PARAMETERS
  commandline // Command to execute within the application [string] [required]
  inputfile // Names of local files to use as input [array of string]

For example, to generate a chi-squared distribution with 22 degrees of freedom using Wolfram Engine:

Code Block
languagejs
charityengine.wolframengine('ChiSquareDistribution[22]');

vina

Run a protein-ligand binding simulation

Code Block
languagejs
charityengine.vina(commandline, inputfile)
PARAMETERS
  commandline // Command line for the application [string] [required]
  inputfile // Names of local files to use as input [array of string]
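
For example, to dock a ligand against a receptor (the file names, grid-box values, and command-line flags below follow standard AutoDock Vina usage and are illustrative assumptions, not a confirmed part of this interface):

Code Block
languagejs
// Illustrative sketch: receptor.pdbqt and ligand.pdbqt are hypothetical input files.
charityengine.vina('--receptor receptor.pdbqt --ligand ligand.pdbqt --center_x 0 --center_y 0 --center_z 0 --size_x 20 --size_y 20 --size_z 20', ['receptor.pdbqt', 'ligand.pdbqt'])
  .then(response => console.log('Docking result:', response))
  .catch(error => console.error('Error:', error));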

blastp

Compare an amino acid sequence (protein sequence) against a protein sequence database

Code Block
languagejs
charityengine.blastp(commandline, inputfile)
PARAMETERS
  commandline // Command line for the application [string] [required]
  inputfile // Names of local files to use as input [array of string]
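
For example, to search a query sequence against a protein database (the file name, database name, and flags follow standard NCBI BLAST+ usage and are illustrative assumptions):

Code Block
languagejs
// Illustrative sketch: query.fasta and the swissprot database are hypothetical inputs.
charityengine.blastp('-query query.fasta -db swissprot -evalue 0.001 -outfmt 6', ['query.fasta'])
  .then(response => console.log('BLAST results:', response))
  .catch(error => console.error('Error:', error));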


See the Charity Engine Application Library for further details on the specific functionality of applications or execution environments.