Intelligent Search: The AI That Finds What You Mean
Traditional search fails when you forget the name. Our new local AI understands your intent.
In the tech world right now, "AI" is a buzzword that gets slapped onto everything—often where it's not even needed. At Luma, we're careful about jumping on bandwagons. We only add features that solve real problems.
Recently, we realized there was a gap in how we find our saved content.
We all do it: we bookmark a page, thinking "I'll definitely need this for my project later," and then… we forget about it. Six months later, we remember the idea of the page, but not the title, and definitely not the URL.
If you try to search for it with traditional keyword matching, you're stuck: unless you remember the exact words, you won't find the bookmark.
This is where AI is actually necessary.
Introducing Intelligent Search
To solve this, we've built an intelligent search agent.
Unlike standard search, which matches the exact characters you type, our AI search understands the meaning behind your query. You can describe what you're looking for in natural language, and the AI will find related items even if they don't share a single keyword.
For example, you can search for "coding tutorials for beginners" and it might find your bookmarks labeled "React 101" or "Python Basics". It connects the dots between what you say and what you mean.
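Under the hood, this kind of semantic matching typically works by converting the query and every bookmark into embedding vectors and ranking bookmarks by cosine similarity to the query. The sketch below uses tiny hand-made 3-dimensional vectors in place of real model output (the helper names, weights, and toy numbers are our own illustration, not Luma's actual code):

```typescript
// Cosine similarity: how closely two embedding vectors point in the
// same direction, independent of their magnitudes.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Toy 3-dimensional "embeddings" (a real MiniLM model produces 384 dims).
const bookmarks = [
  { title: "React 101",     vec: [0.9, 0.1, 0.0] },
  { title: "Python Basics", vec: [0.8, 0.2, 0.1] },
  { title: "Banana bread",  vec: [0.0, 0.1, 0.9] },
];
const queryVec = [0.85, 0.15, 0.05]; // e.g. "coding tutorials for beginners"

// Rank bookmarks by similarity to the query vector.
const ranked = bookmarks
  .map((b) => ({ title: b.title, score: cosineSimilarity(queryVec, b.vec) }))
  .sort((x, y) => y.score - x.score);
```

With real embeddings, "React 101" and "Python Basics" score high against a beginner-tutorial query even though they share no keywords with it, while unrelated pages fall to the bottom.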
Privacy First: Always Local
One of our core principles at Luma is privacy. We believe your data belongs to you.
That's why our AI search runs entirely locally in your browser. We don't send your search queries or your bookmark data to any external server for processing.
We achieve this using the Xenova/all-MiniLM-L6-v2 model, an efficient machine learning model that we've optimized to run directly on your device. Specifically, we use the 8-bit quantized version, which keeps it lightweight and fast with almost no loss in accuracy.
This means you get the power of semantic search without ever compromising your privacy.
The Best of Both Worlds: Hybrid Search
We didn't just stop at AI. We believe that for a search to be truly "intelligent," it needs to handle every scenario.
That's why we built a Hybrid Search Engine.
While our AI understands meaning, sometimes you do just want to find a file by its exact name, or you might make a typo. For these cases, we use FlexSearch, a super-fast full-text search library.
FlexSearch handles the "fuzzy" side of things—catching typos and matching exact keywords instantly—while our AI handles the "semantic" side, understanding what you actually mean. By combining these two technologies, Luma gives you the most comprehensive search experience possible.
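One common way to combine the two engines is a weighted fusion of their normalized scores. The sketch below shows the idea; the weights, helper names, and toy data are illustrative assumptions, not Luma's actual tuning:

```typescript
interface Scored { id: number; score: number }

// Merge keyword (FlexSearch-style) and semantic (embedding) results with
// a weighted sum of normalized scores. Items found by only one engine
// keep that engine's contribution; items found by both get a boost.
function hybridMerge(
  keyword: Scored[],
  semantic: Scored[],
  keywordWeight = 0.4,
  semanticWeight = 0.6,
): Scored[] {
  const combined = new Map<number, number>();
  for (const { id, score } of keyword) {
    combined.set(id, (combined.get(id) ?? 0) + keywordWeight * score);
  }
  for (const { id, score } of semantic) {
    combined.set(id, (combined.get(id) ?? 0) + semanticWeight * score);
  }
  return [...combined.entries()]
    .map(([id, score]) => ({ id, score }))
    .sort((a, b) => b.score - a.score);
}

// Example: bookmark 1 matches an exact keyword, bookmark 2 only matches
// semantically, and bookmark 3 matches both, so it ranks first.
const merged = hybridMerge(
  [{ id: 1, score: 1.0 }, { id: 3, score: 0.5 }],
  [{ id: 2, score: 0.9 }, { id: 3, score: 0.8 }],
);
```

A bookmark that scores moderately in both engines can outrank one that scores well in only one, which is exactly the "best of both worlds" behavior a hybrid engine is after.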
What's Next?
We're just getting started with local intelligence. We are closely monitoring the development of Gemini Nano, Google's most efficient model built for on-device tasks.
Once Gemini Nano is available globally for Chromium browsers, we plan to integrate it into Luma to bring even smarter capabilities to your bookmarks.
For now, we're sticking with our current optimized model because it delivers the speed and accuracy you need today. Give it a try—just hit search and describe what you're looking for.
