An end-to-end, multimodal vector search engine. Store and query unstructured data such as text, images, and code through a single easy-to-use API.
With Marqo, input pre-processing, machine learning inference, and storage are all included out of the box.
Run Marqo in a Docker container on your laptop or scale it up to dozens of GPU inference nodes in the cloud. Marqo can be scaled to provide low-latency searches against multi-terabyte indexes.
Marqo helps you configure deep-learning models like CLIP to pull semantic meaning from images. You can seamlessly search any combination of text and images and even combine text and images into a single vector.
Marqo has you covered from inference to storage. With Marqo you don't need to calculate the vectors yourself, simply select the model you want to use and pass the text and/or image URLs directly to the API.
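As a sketch of what this looks like in practice (the index name, field names, and URL below are illustrative, and the client calls are shown commented out because they require a running Marqo server):

```python
# Sketch of an add-documents payload for a hypothetical "my-index".
# Marqo fetches the image from the URL and vectorises it server-side;
# no client-side vector computation is needed.
documents = [
    {
        "_id": "product-1",
        "caption": "a red running shoe",
        "image": "https://example.com/images/shoe.jpg",
    }
]
# Only fields listed in tensor_fields are vectorised for search.
tensor_fields = ["caption", "image"]

# With the Python client, against a local Marqo instance:
# import marqo
# mq = marqo.Client(url="http://localhost:8882")
# mq.index("my-index").add_documents(documents, tensor_fields=tensor_fields)
# results = mq.index("my-index").search("footwear for jogging")
```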
Vector search allows you to go beyond keyword matching and lets you search based on the meaning of text, images and other unstructured data.
Be a part of our community and help us revolutionise the future of search. Join the discussion whether you are a contributor, a user, or simply have questions about Marqo.
Jumpstart your search experience in a couple of minutes. Marqo has libraries available in a number of languages.
On top of our zero-configuration deployment with Docker, you can choose to tailor settings to suit the nature of your deployment and search experience.
Marqo Cloud provides you with a managed service so you can focus on building your core product. The cloud gives you access to an easy-to-use console where you can manage your resources, users, and billing. Join the cloud today!
Marqo provides a feature-rich developer experience - it lets you perform multimodal vector search without sacrificing features like filtering and highlighting.
Marqo can be used with text and/or image data. Multimodal indexes seamlessly handle image-to-image, image-to-text and text-to-image search.
Marqo supports weighted queries which can combine multiple text and image queries together. Negative weights can be added to query terms to push certain items out of your result set.
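A sketch of a weighted query, where keys are query terms (text or image URLs) and values are weights; the terms and weights below are illustrative, and the client call is commented out because it needs a running server:

```python
# Positive weights pull results toward a term; a negative weight pushes
# matching items down or out of the result set.
weighted_query = {
    "a cosy armchair": 1.0,
    "https://example.com/images/armchair.jpg": 0.7,  # image URL as a query term
    "office furniture": -0.4,  # demote office-style items
}

# With the Python client (mq = marqo.Client(...)):
# results = mq.index("my-index").search(q=weighted_query)
```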
Marqo supports score modifiers: numeric fields in your documents can be used to manipulate the score and influence the ranking of results.
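As a sketch, a score-modifiers parameter scales or shifts the vector-similarity score using numeric document fields; the field names and weights below are illustrative, and the client call is commented out because it needs a running server:

```python
# Each entry names a numeric field in the documents and a weight to
# apply to it when adjusting the similarity score.
score_modifiers = {
    "multiply_score_by": [
        {"field_name": "popularity", "weight": 1.2},
    ],
    "add_to_score": [
        {"field_name": "freshness", "weight": 0.5},
    ],
}

# With the Python client (mq = marqo.Client(...)):
# results = mq.index("my-index").search("winter jacket",
#                                       score_modifiers=score_modifiers)
```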
Additional context can be added to queries by providing vectors directly; this helps tailor results without the overhead of additional inference.
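A sketch of a query context, where pre-computed vectors (each with a weight) are blended into the query without extra inference; the short 4-dimensional vector is purely illustrative, since real context vectors must match the dimensionality of the index's model:

```python
# Each context entry pairs a raw vector with a weight controlling how
# strongly it influences the combined query vector.
context = {
    "tensor": [
        {"vector": [0.1, -0.3, 0.25, 0.9], "weight": 0.5},
    ]
}

# With the Python client (mq = marqo.Client(...)):
# results = mq.index("my-index").search("summer outfit", context=context)
```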
Import open source models from Hugging Face, bring your own, or load private models from AWS S3 or Hugging Face using authentication.
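As a rough sketch, the model an index uses is chosen at index-creation time via its settings; the model name below is one of Marqo's registered Hugging Face models, but the exact settings schema varies between Marqo versions, so treat this purely as an illustration:

```python
# Illustrative index settings selecting a Hugging Face text model.
# The precise key names may differ across Marqo versions.
index_settings = {
    "model": "hf/e5-base-v2",
}

# With the Python client (mq = marqo.Client(...)):
# mq.create_index("my-index", settings_dict=index_settings)
```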
Parts of Marqo's API support bulk operations to improve throughput. These bulk operations enable use cases such as bulk changes to multiple indexes or coalescing of queries.
Marqo is horizontally scalable and can be run at million-document scale whilst maintaining lightning-fast search times.
Marqo provides search highlighting functionality which allows you to transparently understand where and why a match occurred.
Marqo offers a powerful query DSL (domain-specific language), which can be applied as a prefilter ahead of approximate k-NN search.
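A sketch of such a prefilter: Marqo's filter syntax is Lucene-like, and the filter below restricts the candidate set before the approximate k-NN search runs. The field names and values are illustrative, and the client call is commented out because it needs a running server:

```python
# Only documents matching this filter are considered for vector search.
filter_string = "price:[50 TO 200] AND in_stock:true"

# With the Python client (mq = marqo.Client(...)):
# results = mq.index("my-index").search("leather boots",
#                                       filter_string=filter_string)
```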
Marqo was made by developers, for developers. We work with contributors to help shape the development experience.
Get the assistance you need, here and now.
Marqo is accessible to everyone, down to the very core.
Marqo's integration with Docker allows you to deploy on any cloud provider you prefer, be it AWS, GCP, Azure, or Marqo Cloud.
Get involved with building the future of Marqo or access support from Marqo's vast community by joining us on Slack and GitHub.
Discuss issues, PRs, and what features to add. Vote on your favorite features. Let us know!
Join the group discussion and get your head-scratchers answered by the community.
Have a look at Marqo's release notes to keep up-to-date with development.