Build cross-modal and multi-modal applications on the cloud
Jina is a framework that empowers anyone to build cross-modal and multi-modal[*] applications on the cloud. It uplifts a PoC into a production-ready service. Jina handles the infrastructure complexity, making advanced solution engineering and cloud-native technologies accessible to every developer.
Applications built with Jina enjoy the following features out-of-the-box:
```shell
pip install jina
```
Document, Executor and Flow are three fundamental concepts in Jina.
Leveraging these three concepts, let's look at a simple example below:
```python
from jina import DocumentArray, Executor, Flow, requests


class MyExec(Executor):
    @requests
    async def add_text(self, docs: DocumentArray, **kwargs):
        for d in docs:
            d.text += 'hello, world!'


f = Flow().add(uses=MyExec).add(uses=MyExec)

with f:
    r = f.post('/', DocumentArray.empty(2))
    print(r.texts)
```
- `MyExec` defines an async function `add_text` that receives a `DocumentArray` from network requests and appends `'hello, world!'` to each Document's `.text`;
- `f` defines a Flow that chains two Executors;
- The `with` block opens the Flow, sends an empty DocumentArray of two Documents to it, and prints the result.
Running it gives you the following output at the last line:

```text
['hello, world!hello, world!', 'hello, world!hello, world!']
```
While one could use standard Python with the same number of lines and get the same output, Jina accelerates time to market of your application by making it more scalable and cloud-native. Jina also handles the infrastructure complexity in production and other Day-2 operations so that you can focus on the data application itself.
The example above can be refactored into a Python Executor file and a Flow YAML file:
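A minimal sketch of that split is shown below. The file names `toy.py` and `toy.yml` are illustrative, and the exact YAML keys may vary across Jina versions:

```python
# toy.py — the Executor moved into its own module
from jina import DocumentArray, Executor, requests


class MyExec(Executor):
    @requests
    async def add_text(self, docs: DocumentArray, **kwargs):
        for d in docs:
            d.text += 'hello, world!'
```

```yaml
# toy.yml — the Flow topology declared in YAML
jtype: Flow
with:
  port: 51000
  protocol: grpc
executors:
  - uses: MyExec
    py_modules:
      - toy.py
  - uses: MyExec
    py_modules:
      - toy.py
```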
Run the following command in the terminal:
```shell
jina flow --uses toy.yml
```
Once the server has started successfully, you can use a client to query it:
```python
from jina import Client, Document

c = Client(host='grpc://0.0.0.0:51000')
c.post('/', Document())
```
This simple refactoring lets developers write an application in the client-server style. Separating the Flow YAML from the Executor Python file not only makes the project more maintainable but also takes scalability and concurrency to the next level:

- Complex topologies can be defined via `needs` in YAML/Python, and load-balancing is automatically added when necessary to ensure maximum throughput.
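As a sketch, a Flow YAML might declare a branched topology like the one below. The executor names and the `replicas` value are illustrative:

```yaml
jtype: Flow
executors:
  - name: enc1
    uses: MyExec
    replicas: 2        # scale this Executor horizontally
  - name: enc2
    uses: MyExec
    needs: gateway     # run in parallel with enc1
  - name: joiner
    needs: [enc1, enc2]  # merge both branches
```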
Without having to worry about dependencies, you can easily share your Executors with others; or use public/private Executors in your project thanks to Jina Hub.
To create an Executor:
```shell
jina hub new
```
To push it to Jina Hub:
```shell
jina hub push .
```
To use a Hub Executor in your Flow:
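For example, a Flow can reference a Hub Executor via the `jinahub://` or `jinahub+docker://` URI schemes (`MyExecutor` is a placeholder name here):

```yaml
jtype: Flow
executors:
  - uses: jinahub://MyExecutor          # pull and run from source
  - uses: jinahub+docker://MyExecutor   # run inside its prebuilt container
```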
Behind this smooth experience is advanced management of Executors:
Using Kubernetes becomes easy:
```shell
jina export kubernetes flow.yml ./my-k8s
kubectl apply -R -f my-k8s
```
Using Docker Compose becomes easy:
```shell
jina export docker-compose flow.yml docker-compose.yml
docker-compose up
```
Using Prometheus becomes easy:
```python
from jina import Executor, requests, DocumentArray


class MyExec(Executor):
    @requests
    def encode(self, docs: DocumentArray, **kwargs):
        with self.monitor('preprocessing_seconds', 'Time preprocessing the requests'):
            docs.tensors = preprocessing(docs)
        with self.monitor('model_inference_seconds', 'Time doing inference on the requests'):
            docs.embeddings = model_inference(docs.tensors)
```
Using Grafana becomes easy: just download this JSON and import it into Grafana.
What cloud-native technology is still challenging to you? Tell us; we will handle the complexity and make it easy for you.