A Blog About Programming, Search & User Experience

Try out our new experimental version of HN Search

Exactly a year ago, we began to power the Hacker News search engine (see our blog post). Since then, our HN search project has grown a lot, expanding from 20M to 25M indexed items and going from 900K to 30M searches served per month. In addition to hn.algolia.com we’re also providing the HN Search API: a lot of you have used it to build various readers or monitoring tools, and we love the applications you’re building on top of us. The community was also pretty active on GitHub, requesting improvements and catching bugs… keep on contributing!

Eating our own dog food on HN search

We are power users of Hacker News and there isn’t a single day we don’t use it. Being able to use our own engine on a tool that is so important to us has been a unique opportunity to eat our own dog food. We’ve added a lot of API features during the year but unfortunately hadn’t found the time to refresh the UI until now.

One of our 2015 resolutions was to push the envelope of the HN search UI/UX:
  • make it more readable,
  • more usable,
  • and use modern frontend frameworks.

That’s what motivated us to release a new experimental version of HN Search. Try it out and tell us what you think!

Applying more UI best practices

We’ve learned a lot from the comments of users of the previous version. We also took a look at all the cool apps built on top of our API. We wanted to apply more UI best practices, and here is what we ended up with:

Focus on instantaneity

The whole layout has been designed to provide an instant experience, reducing the wait time before the actual content is displayed. It’s also a way to reduce the number of mouse clicks needed to access and navigate through the content. The danger with that kind of structure is ending up with a flickering UI where each user action redraws the page, triggering unwanted behaviors and consuming a huge amount of memory. We focused on a smooth experience. Some of the techniques used are based on basic performance optimizations, but in the end what really matters to us is the user’s perception of latency between interactions, more than objective performance. Here are some of the tricks we applied:

  • Toggle comments: we wanted the user to be able to read all the comments of a story on the same page. Our API, built on top of Firebase, allowed us to load and display them with a single call (see the first sketch after this list).
  • Sticky posts: in some cases we load up to 500 comments. We wanted users to keep track of what they are reading and to be able to easily collapse it, so we decided to keep the initial post pinned at the top of the list.
  • Lazy-loading of non-cached images: when the UI is refreshed on every request, you don’t want every thumbnail to flicker while loading, so we applied a simple fade to avoid that. But there is no direct way to know whether an image was already loaded by a previous query; we managed to detect it with a small timeout (see the second sketch after this list).
  • Loading feedback: the most important part of a reactive UI is to always give the user feedback on the state of the UI. We chose to display this information with a thin loading bar at the top of the page.
  • Deferring the load of some unnecessary elements: this one is about performance. When you display about 20 repeated items on each keypress, you want them to be as light as possible. In our case we are using Angular.js with some directives that were too slow to render, so we ended up rendering them only when the user interacts with them.
  • Cache every request: this is mainly about the backspace key. When users edit their query by removing characters, you don’t want to make them wait for the results; that caching is handled by the Algolia JS API client.
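
To make the comment toggling above concrete, here is a minimal sketch of fetching a story together with its full comment tree in a single call to the HN Search API (the documented /api/v1/items/:id route); the comment counting and logging are purely illustrative:

// Fetch a story and its nested comments in one request,
// then walk the tree to count how many comments we would render.
function loadStoryWithComments(storyId) {
  return fetch('https://hn.algolia.com/api/v1/items/' + storyId)
    .then(function (response) { return response.json(); })
    .then(function (story) {
      var count = 0;
      (function walk(item) {
        (item.children || []).forEach(function (child) {
          count += 1;
          walk(child);
        });
      })(story);
      console.log(story.title + ' has ' + count + ' comments');
      return story;
    });
}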
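
And here is a hypothetical sketch of the timeout trick used for thumbnails: shortly after setting src, a cached image already reports itself as complete and can be shown instantly, while everything else gets the fade. The class names and the 50ms delay are assumptions, not the exact production code:

// Small timeout trick: right after setting src, a cached image reports
// complete; a network fetch almost never does.
function loadThumbnail(img, url) {
  img.src = url;
  setTimeout(function () {
    if (img.complete) {
      img.classList.add('visible');     // cached: show it instantly, no fade
    } else {
      img.addEventListener('load', function () {
        img.classList.add('fade-in');   // freshly loaded: fade it in
      });
    }
  }, 50);
}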

Focus on readability

We’ve learned a lot from your comments since releasing our first HN Search version last year. The readability of search results must be outstanding so that you can quickly understand why the results were retrieved and what they are about. We ended up with 2 gray colors and 2 font weights to ease readability without being too distracting.

Stay as minimal as possible

If you see unnecessary stuff, please tell us. We are not looking for the most ‘minimal’ UI but for the right balance between usability and minimalism.

Sorting & Filtering improvements

Most HN Search users are advanced users. They know exactly what they are searching for and want the ability to sort and filter their results precisely. We now expose a simple way to sort results by date or popularity, in addition to the period filtering we already had.
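
These options map directly onto the public API: relevance-ordered queries go to the search endpoint, date-ordered queries to search_by_date, and the period filter becomes a numeric filter on created_at_i. A minimal sketch (the helper name is ours):

// Popularity vs. date ordering use two different endpoints, and the
// period filter is expressed as a numeric filter on created_at_i.
function searchStories(query, sortByDate, sinceTimestamp) {
  var endpoint = sortByDate ? 'search_by_date' : 'search';
  var url = 'https://hn.algolia.com/api/v1/' + endpoint +
    '?query=' + encodeURIComponent(query) +
    '&tags=story' +
    '&numericFilters=created_at_i>' + sinceTimestamp;
  return fetch(url).then(function (response) { return response.json(); });
}

// Example: stories about "algolia" from the last week, most recent first.
var oneWeekAgo = Math.floor(Date.now() / 1000) - 7 * 24 * 3600;
searchStories('algolia', true, oneWeekAgo).then(function (results) {
  console.log(results.nbHits + ' stories found');
});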

Inlined comments

We thought it would make a lot of sense to be able to read the comments of a story directly from the search results page. Keeping in mind that it should be super readable, we went for indentation and author-colored avatars, making it really clear who is replying to whom.

Search settings

Because HN Search users are advanced users, they want to be able to customize the way the default ranking works. So be it: we’ve exposed a subset of the underlying search settings to let you customize it.

Front page

Since Firebase provides the official Hacker News API, fetching the items currently displayed on the front page is really easy. We decided to pair it with our search, allowing users to search hot stories & comments through a discreet menu item.
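
As an illustration, the ids of the stories currently on the front page can be fetched from the official Firebase-hosted API like this (slicing to 30 items is an assumption about the page size):

// Fetch the front page story ids from the official Hacker News API,
// then keep the first page; these ids can then be looked up through our index.
fetch('https://hacker-news.firebaseio.com/v0/topstories.json')
  .then(function (response) { return response.json(); })
  .then(function (ids) {
    var frontPage = ids.slice(0, 30);
    console.log('Front page story ids:', frontPage);
  });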

Starred

Let’s go further: what about starring stories so you can search them later? You can now star any story directly from the results page. The stars are stored locally in your browser for now. Let us know if you find the feature valuable!
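
Since stars only live in the browser for now, a tiny localStorage wrapper is all that is needed. This sketch is illustrative; the storage key name is an assumption, not the actual implementation:

// Toggle a star for a story id and persist the list in localStorage.
var STARS_KEY = 'hnsearch_starred_stories';

function getStars() {
  return JSON.parse(localStorage.getItem(STARS_KEY) || '[]');
}

function toggleStar(storyId) {
  var stars = getStars();
  var index = stars.indexOf(storyId);
  if (index === -1) {
    stars.push(storyId);          // star
  } else {
    stars.splice(index, 1);       // unstar
  }
  localStorage.setItem(STARS_KEY, JSON.stringify(stars));
  return stars;
}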

Contribution

As you may know, the whole source code of the HN Search website is open source and hosted on GitHub. This new version is still based on a Rails 4 project and uses Angular.js as the frontend framework. We’ve improved the README to help you start contributing in minutes. Not to mention: we love pull requests.
The most important part of this project now starts again: user testing. We count on you to give us the feedback we need to make this search your favorite one.

Wanna test?

To try it, go to our experimental version of HN Search, go to “Settings”, and enable the new style:

 

 

Want to contribute?

It’s open-source and we’ll be happy to get your feedback! Just use GitHub’s issues to report any idea you have in mind. We also love pull-requests :)

Source code: https://github.com/algolia/hn-search

Try it now!

Christmas GiftHub: Awesome Autocomplete

By working every day on building the best search engine, we’ve become obsessed with our own search experience on the websites and mobile applications we use.
We’re git addicts and love using GitHub to store every single idea or project we work on. We use it for both our private and public repositories (12 API clients, HN Search and various demos). We use its search function every day, and we decided to rebuild it the way we thought it should be. We’re proud to share it with the community via this Chrome extension. Our GitHub Awesome Autocomplete enables seamless and fast access to GitHub resources via as-you-type search.

Install your Christmas Gift now!


Github Awesome Autocomplete Algolia Search

Features

The Chrome extension replaces GitHub’s search bar and adds autocomplete capabilities on:
  • top public repositories
  • last active users
  • your own private repositories (this is done locally in JavaScript without Algolia: the list of private repositories stays in your browser, see the sketch after this list)
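
That last point can be sketched in a few lines: the extension keeps the list of private repository names in the browser and filters it with plain JavaScript, no Algolia query involved (the function and variable names below are ours):

// Case-insensitive substring match over a locally stored list of
// private repositories; nothing is sent to Algolia.
function filterPrivateRepos(privateRepos, query) {
  var q = query.toLowerCase();
  return privateRepos
    .filter(function (repo) { return repo.toLowerCase().indexOf(q) !== -1; })
    .slice(0, 5);   // keep the dropdown short
}

// Example
filterPrivateRepos(['algolia/hn-search', 'algolia/secret-project'], 'hn');
// -> ['algolia/hn-search']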

How does it work?

We continuously retrieve the most watched repositories and the most recently active users from the GitHub Archive dataset. Users and repositories are stored in 2 Algolia indices: users and repositories. Queries are performed with our JavaScript API client, and the autocomplete menu is based on Twitter’s typeahead.js library.
The underlying Algolia account is replicated in 6 regions using our DSN feature, answering every query in 50-100ms wherever you are (network latency included!). Regions include US West, US East, Europe, Singapore, Australia & India.
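
Roughly, the wiring between typeahead.js and the two indices looks like the sketch below, using the algoliasearch JavaScript client. The credentials, CSS selector and display attributes are placeholders, not the extension’s actual values:

// Connect typeahead.js to the users and repositories indices.
var client = algoliasearch('APP_ID', 'SEARCH_ONLY_API_KEY');
var repositories = client.initIndex('repositories');
var users = client.initIndex('users');

// Build a typeahead source function from an Algolia index.
function source(index) {
  return function (query, callback) {
    index.search(query, { hitsPerPage: 5 }, function (err, content) {
      callback(err ? [] : content.hits);
    });
  };
}

$('#github-search').typeahead({ hint: true, highlight: true, minLength: 1 }, {
  name: 'repositories',
  displayKey: 'full_name',
  source: source(repositories)
}, {
  name: 'users',
  displayKey: 'login',
  source: source(users)
});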

Exporting the records from GitHub Archive

We used GitHub’s Archive dataset to export top repositories and last active users using Google’s BigQuery:

 

Configuring Algolia indices

Here are the 2 index configurations we used to build the search:

Repositories

Users
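
The original post embeds the full configurations. As a rough illustration of what such a configuration involves, here is how a repositories index could be made searchable by name and description, with popularity breaking ties, using the JavaScript client; the attribute names and values below are assumptions, not our production settings:

// Illustrative settings for a repositories index: search on name and
// description, rank ties by number of watchers.
var client = algoliasearch('APP_ID', 'ADMIN_API_KEY');   // placeholders
var repositories = client.initIndex('repositories');

repositories.setSettings({
  attributesToIndex: ['name', 'description'],
  customRanking: ['desc(watchers)']
}, function (err) {
  if (err) console.error(err);
});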

Want to contribute?

It’s open-source and we’ll be happy to get your feedback! Just use GitHub’s issues to report any idea you have in mind. We also love pull-requests :)

Or do you just want to add instant search to your website or application?

Feel free to create a 14-day FREE trial at http://www.algolia.com and follow one of our step-by-step tutorials at https://www.algolia.com/doc/tutorials

FanFootage: Solving the Search Problem With Algolia

The following post is a guest post by Eoin O’Driscoll (web developer), and Vinny Glennon (co-founder) of FanFootage.com


When we founded FanFootage we knew there was something lacking in the concert and event experience. Fans were taking videos on their mobile phones during performances and uploading them to YouTube, and the audio was horrible. So we created FanFootage, where fans can upload their own videos and we work with the bands to get the high-definition audio and put the two together. Now fans can come to our site and see their favorite concerts or sporting events from any angle. They can also search for upcoming events and performances.

For a recent Linkin Park concert, the band and their fan group contacted us. Fans uploaded more than 1,500 videos from almost every angle. On FanFootage that single concert has had over 350,000 page views.


Our user experience heavily relies on search

Early in our design process we knew that search would be key to making FanFootage the ultimate fan experience. When users come to our site, the first thing they do is search for an event or artist. We need to make sure they find either the artist they are looking for or something similar, and it has to be fast.

As developers, our team isn’t new to search, particularly within the entertainment space. Our previous startup in the music space was bought by RealNetworks, and a second startup was a competitor to Google. That is where we learned that search is hard. So when we considered building our own search for FanFootage, we quickly decided it wasn’t going to happen.

We also know what fans need. User expectations have changed now that they can access anything from their phones. Today we expect our applications and services to predict what we are going to do next. And because of Google, people no longer search with a single phrase: users expect search to understand how phrases fit together and relate to each other, and of course it needs to spell-check and it must be instant.

We also had different search requirements than other sites. Normally search on a site covers a single unit or concept, a site for flowers for example. We needed to let fans search for artists, bands, friends or upcoming events in their area and never hit a zero result.

Why we chose Algolia

After looking at a few search applications we agreed on Algolia. Many search solutions look nice but don’t have the flexibility we needed to configure them the way our business required. And most weren’t fast.

Why did we choose Algolia? First, it has a developer-centric approach. It took us 2 hours to configure and a day to test, and that was it. We basically had search up and running in a day. The dashboard lets me know that API calls return within milliseconds, and we have all the flexibility we need to keep configuring as our content grows.

Today, more than 250 artists have used FanFootage in 20 countries. We are growing quickly. As a company we are still learning what our fans are searching for and Algolia is helping us with that. As content grows we will continue to configure search to meet the needs of our fans. We will also be rolling out Algolia for mobile because of its multi-search capabilities.

Algolia is a simple solution to a complex problem. And it blew us away. It just works. And now we can focus on our own fanbase.

Images courtesy of FanFootage. Learn more on their website

Chapter 1 • Search inside websites and mobile apps is strategic to engage visitors

Summary: In an economic environment where the competition for end-users’ attention and interest is fierce, overlooking search inside your website and mobile application may damage your business. Powerful and reliable Web search engines such as Google have created deeply rooted expectations for responsive and intuitive access to online content, and your users expect the same responsive experience once they access your service. Yet most websites and mobile applications still provide a frustrating and cumbersome navigation and exploration experience, supported by a poor internal search engine. Besides, people cannot stand wasting time, and Google got it: they’ve made moving between websites effortless. What people don’t find easily with you, Google will find for them, and it may be with your competitors. Great site search reinforces retention, but also brand awareness and customer loyalty.

End-users have high expectations when it comes to search

Google’s mission is to organize the world’s information and make it universally accessible. This is how people use the Web: they hunt for information and content. As early as 2004, according to the Nielsen Norman Group, people started their web sessions with a search engine 88% of the time. This hunt for content does not stop once users access your service. By using extremely fast and intuitive Web search engines such as Google or Yahoo, users have developed well-established, unconscious expectations about what great search should be: the invisible link that understands an intent and translates it into the right answer. Users have been conditioned to rely on such responsive and supportive search interfaces.

With the ever-growing amount of content online services offer their users, internal search is now more central than ever to keep up with this need for immediate access to relevant answers. Search has become the most important UX component for information retrieval and exploration inside online services. Yet the gap keeps widening between this need for powerful internal access to content and the poor navigability of some online services. It has become so pronounced that, unconsciously, people would rather trust Google to find content inside your service than your own internal search and navigation engine.

Return On Time Invested is the search’s KPI

People see the Web as an “integrated whole” where the fundamental units are pieces of information, not websites, so it is critical for websites and mobile applications to be able to quickly surface relevant information. In such a system, expecting users to navigate complicated information architectures through endless links and tabs is simply not a viable solution.

Users optimize their time and efforts in their hunt for information (see the information foraging theory by Pirolli). They behave just like our ancestors who looked for patches of food, aiming to get the largest benefit with the smallest effort. They exhibit a short attention span, are time-constrained and highly impatient. Thus, they exercise judgement and pragmatic decision-making strategies in deciding whether to persevere with a given information resource or to look for a different one. The amount of time a user spends on a given website is directly proportional to the travel time between sites, and what happens is a phenomenon Jakob Nielsen (2003) describes as information snacking: since information resources are often disappointing and the between-patch time decreases thanks to Google and fast Internet connections, users simply spend less time on a given website and instead multiply their options. All ecommerce websites know the usability guideline: “If users can’t find a product, they won’t buy it.” But with Google and the shrinking travel time between websites, things have changed: “If users can’t find it fast, they won’t buy it” is now closer to reality.

Search is a key element of your users’ loyalty

According to a Kelton Research study on “the state of search” conducted among one thousand American adults (2007), 78% of those who experience search engine fatigue “wished” that search engines could somehow “read their minds”. Visitors need to feel understood and treated fairly when interacting with a service. If you think about it, the search bar of a website or mobile application is the one field where users express their intent most clearly. It is by far the most valuable touch point between an end-user and an online service, as well as a unique opportunity to engage a user in a “digital” conversation. Not surprisingly, returning poor results when a user takes the trouble to articulate their intent translates into poor retention: would you engage in a relationship with someone who constantly answers off topic? Probably not, and yet that is what happens on the Web today. The disappointment caused by a lack of relevance damages your credibility and your brand.

Relevance is mandatory for retention, but personalization is the key to loyalty. And whereas it is not really possible to offer a browsing interface personalized per user, an efficient search function can provide an experience tailored to the particular needs of end-users. Results for a particular query can be pushed up the results page according to personal data gathered during a session. Search rankings can also be tweaked on a per-profile basis, taking into account in real time the preferences of each user, their friends, etc.

Let’s wrap up!

Today we are in a paradoxical situation where most efforts go toward external findability, with websites wanting to be immediately accessible from Web search engines. But without a strong focus on the website’s own search feature to achieve great internal findability, all those branding and search engine optimization tactics are in vain. Internal search is about organizing your own information and making it universally accessible to your own users: what Google did for the Web, you now need to do for yourself!

 

Blog post series: Why You Should Care About Search

In our next blog post, we will dig into some of the characteristics of a great search experience.

 

Blog post series: Why You Should Care About Search

Introduction: Search Experience Is The New Frontier for Online Services

Our goal at Algolia is to help websites and mobile applications maximize their users’ engagement thanks to a unique built-in search experience.

Great mission, right? … Wait a second: how can this be the solution to the most critical challenge of online services:

ui searchbox

… a blank field? Aren’t we overselling search a little bit here?

Well, search is actually a lot more than this small box: it is at the core of the user experience inside all online services. Defined in the 1990s by Don Norman, the “user experience” refers to the range of interactions a user has with a company and its services. We may see our own interactions with an online service mostly as the consumption of content, but how we access this content is through search, and how we explore it is through search. Besides, most of our online life is about hunting for information, tacitly expecting immediate relevance and instant satisfaction. Search is the underlying layer of our online hunting experience. Ubiquitous, it is the hidden enabler of our interactions with online services.

 

Blog post series: Why You Should Care About Search

In the following series of blog posts we will explore in more detail:


Let’s wait no more and dive into our first chapter!

Black Thursday DNS issue

Today we had a severe DNS issue that impacted some of our users for a total of 5 hours. Although most of our customers were not impacted, for some of them search simply went down. This event and its details deserve to be fully disclosed.

The context

Up until recently, we were using Amazon Route 53 for our DNS routing needs. When we started to design our Distributed Search Network (algolia.com/dsn) a few months ago, we quickly realized that our needs were outside Route 53's scope: we needed custom routing per user, and the two options of Route 53 simply didn't work:

  • Latency-based routing is limited to the 9 regions of AWS and we have 12;
  • With geography-based routing you need to indicate, country by country, how you want to resolve the IP.

This is a tedious process for a solution that isn't even good, as Route 53 does not support EDNS right now.

So we started to look for new DNS options. Choosing the best DNS provider is not something you do overnight: it took us months to benchmark several vendors and find the right one, NSOne. The filter chain feature of NSOne was a perfect fit for our use case, and the NSOne team was great at understanding our needs; they even went the extra mile for us by building a specific module that allowed better performance.

Something else we discovered during this benchmark was that the algolia.io domain was not as good for performance as algolia.net, as there are far more DNS servers in the .net anycast network than in the .io one. The NSOne team offered us a smart solution based on linked domains, so we wouldn't have to maintain two zones ourselves.

The migration

The goal of the migration was to move from Route 53 to NSOne. For several weeks we worked on importing the records into NSOne and making sure Route 53 and NSOne were synchronized. Our initial tests revealed some issues, but after a few days of continuous updates without any difference between Route 53 and NSOne, we started to feel confident about our synchronization and began migrating the demos of our website to target the new algolia.net domain. We tested performance and resolution from all NSOne POPs (https://nsone.net/technology/network/) to make sure there were no glitches.

These first production tests were successful: synchronization was fine, performance and routing were good, so we decided to move the .io domain from Route 53 to NSOne as well.

The D-day

The big issue when changing DNS is that it is global and involves caching logic, which makes rolling back complex. With users in 45 countries it is almost impossible to find a suitable time for everyone, and DNS changes cannot be done gradually. We decided to push the update during the night for the US, at 4am EST.

We watched the number of queries targeting NSOne rise quickly, and it was once we reached about 1,000 DNS queries per second that we received our first complaints about failed DNS resolutions. This routing issue did not impact all DNS resolutions, but some of them were replying with an NXDOMAIN answer, the equivalent of a DNS "404 not found":


$ dig APPID-1.algolia.io

; <<>> DiG 9.9.5-4.3-Ubuntu <<>> APPID-1.algolia.io
;; global options: +cmd
;; Got answer:
;; ->>HEADER<<- opcode: QUERY, status: NXDOMAIN, id: 25585
;; flags: qr rd ra; QUERY: 1, ANSWER: 0, AUTHORITY: 1, ADDITIONAL: 1

;; OPT PSEUDOSECTION:
; EDNS: version: 0, flags:; udp: 4096
;; QUESTION SECTION:
;APPID-1.algolia.io. IN A

;; AUTHORITY SECTION:
algolia.io. 45 IN SOA dns1.p03.nsone.net. hostmaster.nsone.net. 1414873854 43200 7200 1209600 3600

;; Query time: 24 msec
;; SERVER: 213.133.100.100#53(213.133.100.100)
;; WHEN: Thu Nov 27 10:49:33 CET 2014
;; MSG SIZE rcvd: 115

After double-checking our DNS zone for those specific records, we understood it was an NSOne bug related to our custom routing code. We immediately rolled back to Route 53.

NSOne support was really quick to react and found the issue: it only affected some DNS resolvers with EDNS support on the algolia.io domain. The algolia.net domain was not impacted, which explains why none of the tests we had done were able to detect the issue earlier.

Unfortunately, it did not stop there, and something very unexpected happened: some customers (even ones not previously impacted) started to face issues right after the rollback to Route 53.

In order to improve performance, the custom Algolia module developed by NSOne did some translation on our records: APPID-1.algolia.io was translated into 1.APPID.algolia.io and then resolved to a CNAME for the actual server in the cluster serving that customer. The translation of APPID-1.algolia.io to 1.APPID.algolia.io was done with a TTL of 86,400 seconds (1 day). Since these zones did not exist in Route 53, it was no longer possible to resolve those records there. What made the situation even worse was that this TTL far exceeded the TTL of the NS records. Most DNS servers flushed their cache for the domain once the nameservers changed, but the remaining ones kept the record cached.

TL;DR: do not forget about IPv6. As if that was not enough, we eventually discovered something else: our custom DNS module resolved APPID-X.algolia.io to X.APPID.algolia.io only when there was no direct resolution to an IP address. This translation worked pretty well as long as all the A records were set. But some customers started to report weird resolutions. Normally we resolve APPID-1.algolia.io -> 1.APPID.algolia.io -> servername-1.algolia.io -> IP. This was completely fine until an IPv6 AAAA request came in. Since we had no AAAA records, the custom filter started to resolve: APPID-1.algolia.io -> 1.APPID.algolia.io -> servername-1.algolia.io -> 1.servername.algolia.io -> nothing.

We were in the bad situation feared by all engineers, that lonely moment when you really miss a "purge cache" feature.

Eventually, as soon as we got confirmation of the fix from NSOne, we switched the DNS of algolia.io back to NSOne and helped our customers work around the issue before the cache expired:

  • for customers impacted by the NXDOMAIN issue, a simple migration to the algolia.net domain instead of algolia.io fixed the problem;
  • for those impacted by the Route 53 rollback issue, we created new DNS records for them to work around DNS caches.

Conclusion: what we learned

This is by far the biggest problem we have encountered since the launch of Algolia. Although the first issue was almost impossible to anticipate, we made mistakes and should have handled a few things differently:

  • DNS is a highly critical component, and being the first to use an external custom module was not a good idea, even if it improved performance;
  • Putting more thought into the rollback part of our deployment would have helped us anticipate the second issue. For a component as critical as DNS, having a robust rollback process is mandatory, no matter how much work it represents and even though such an event is extremely unlikely to happen.

We're very sorry for this disruption. We wanted to share these technical details to shed some light on what happened and what we’ve done in response. Thanks for your patience and support.

If you think we missed anything or if you'd like to share your advice on your own best practices, your comments are really welcome.

JadoPado delivers InstaSearch for mobile and web powered by Algolia

Algolia Increases Online Search Sessions By 60% and Unique Mobile Searches by 270%

The following post is a guest post by Omar Kassim, co-founder of JadoPado.

Founded in 2010, JadoPado is one of the largest e-commerce sites serving the GCC, Middle East, North Africa and South Asia. Its CEO Omar Kassim wanted to bring an Amazon-like experience to the region. In just 3 years of operations the company now boasts thousands of customers, hundreds of vendors and over $7 million in annual revenues.

Realizing that search is a key component of their user experience and engagement, Omar and his small team of 15 set out to build new search capabilities that would help users find the products they wanted, lightning fast. In addition, the team was developing a revamped mobile experience and saw that search needed to be spot on for both smartphones and tablets. “I saw search as a competitive tool and as a strategy to get a leg up on our competition. After seeing Algolia on Hacker News I was absolutely blown away. After looking at the demos, we threw out what we were doing internally in terms of a small search revamp, and I had one of our team get cracking with Algolia right away. As a little startup, it really helped that Algolia’s price points were within reach in terms of not breaking the bank to get things rolling.”

The Power of Instant Search

After configuring and testing Algolia for two weeks, JadoPado had the results they were looking for. Branded internally as InstaSearch, the team knew it would dramatically improve how search functioned on both mobile and the web. “The idea from the outset was to build InstaSearch. I kept ending up at the Algolia demo and thought it would be incredible if we could forget all user interaction aside from typing and just display results right away. Remove what you’ve typed and the results disappear, taking you back to where you were. We then spent a bit of time figuring out how to get each result “page” to have a URL that could be used with external search or shared elsewhere,” explained Omar.


Making Search Seamless

“We looked at a number of solutions. One of our biggest intentions was to try to get search to be extremely fast and as slick as possible. Customers should feel like search “just works” and that it is a super easy way to get straight to whatever they may be looking for. Algolia has allowed us to accomplish that,” Omar explained. “Moving search from a not-really-working internal model to a search-as-a-service platform has allowed us to focus on other areas while knowing that search works and that we’ve got an edge over our competition.”

Support For Arabic

With more than 20 countries to support, the JadoPado team knew that the key to success in the region was to ensure that search was delivered in Arabic as well. Omar explained, “The final bits were figuring out a separate set of indexes for Arabic (as we were about to roll out a standalone Arabic version of JadoPado) and getting the faceting right. This was easy to do with the deep Algolia documentation.” Algolia works with all languages, including Chinese, Japanese, Korean and Arabic. No specific configuration is required; speed and ranking perform exactly the same way.

Better Business Through Search

In May the team rolled out InstaSearch, Arabic support and a newly revamped mobile experience with search at the center. JadoPado immediately experienced a doubling in conversions and activity triple that of a typical day. Compared to the same 30-day period in 2013, JadoPado saw site visits through search increase from 8.2% to 11.3%.

Additionally:

  • Sessions with search jumped 59.96%.
  • Unique searches jumped 46.87%.
  • Average search depth increased by 58.87%.

Mastering Mobile Through Search

The greater impact of Algolia’s hosted search was on JadoPado’s revamped mobile experience. Search is often the first action customers take on a mobile device. With instant search, autocorrect and full language support, improving search and the quality of results can have a significant impact on revenues. With Algolia implemented as part of JadoPado’s mobile site, the company saw strong results, with visits from search increasing from 4.3% to 15% over the same time period and session exits decreasing by 16.57%. A big change. And search increased engagement on all levels:

  • Mobile sessions with search jumped by 233.92%.
  • Total unique mobile searches jumped 268.37%.
  • Average search depth on mobile devices jumped by 41.05%.

Images courtesy of JadoPado. Learn more on their website.

AfterShip Leverages Algolia’s Search as a Service to Track 10 Million Packages Around The World

Algolia Speeds Up Search Result Delivery Times From 10 Seconds To 250 Milliseconds

The following post is a guest post by Teddy Chan, Founder and CEO at AfterShip.

AfterShip is an online tracking platform that helps online merchants track their shipments across multiple carriers and notify their customers via email or mobile. Being an online merchant myself, I shipped more than 30,000 packages a month around the world. When customers contacted me to get an update on shipments, I realized that I couldn’t track shipments from different carriers and get updates on their status in a single place. So I built AfterShip to allow both consumers and online merchants to view all their packages on a single platform.

After winning the 2011 Global Startup Battle and 2011 Startup Weekend Hong Kong, AfterShip opened its beta and quickly helped thousands of online merchants send out over 1,000,000 notifications to customers.

One of the key parts of our service is providing customers around the world with up-to-date information about their packages.
Right now we have more than 10 million tracking numbers in our database. This causes a few different challenges when it comes to search and we needed technology that would help us continuously index constantly changing information.

Our first challenge is that we are a small team with only 1 engineer.

We are not in the search business, so we needed a solution that would be easy to implement and work well with our existing infrastructure. Algolia’s extensive documentation made it easy to see that our setup and implementation time would be extremely fast and would work with any language and database, so we could get back to our core business.
Algolia was super easy; we had it tested, up and running in a week.

Our second challenge was quickly delivering search results.

On Redis, searching for packages was simply impossible. Each query would lock everything up until the result was found, so it could run only one search at a time, and each search was taking up to 10 seconds. With Algolia we reduced search result delivery times to 250 milliseconds for any customer anywhere in the world. When you think about thousands of merchants sending more than 1 million packages per month, you can see how speed is critical.

Downtime also is not an option when tracking packages around the globe.
We are very strict when adopting new technologies and SaaS technologies can’t slow down our system.
Algolia had the highest uptime of all the solutions we looked at. There was no physical downtime.

Our final challenge was search complexity.

Sometimes you need to know how many shipments are coming from Hong Kong and exactly where they are in transit to and from the U.S. Shipments going around the globe can change status several times within a single day. With Algolia’s indexing we are able to instantly deliver up-to-date notifications on all 10 million packages, so that customers can not only track their package on its journey, but also go to their online merchant’s shop and see its real-time status.

In the end, it was Algolia’s customer service that won us over.
Similar services and platforms were not responsive. With Algolia we either had the documentation we needed, were able to get advice from an engineer immediately, or had our problem solved in less than a day. With such a small team this means a lot. And with the Enterprise package we know that Algolia will grow as quickly as our business does.

Want to find out more about the Algolia experience?
Discover and try it here

Our 4th datacenter is in California!

Do you know the 3 most important things in search? Speed, speed, and speed!

At Algolia, we work at making access to content and information completely seamless. And that can only be done if search results are returned so fast that they seem instant.

That means two things for us: getting server response time under 10ms (checked), and getting the servers close to end-users to lower latency.

We are on a quest to make search faster than 100ms from anywhere in the world, and today is an important step. We are thrilled to announce the opening of our 4th datacenter, located in California!

Choose Datacenter

You can now choose to be hosted on this datacenter when signing up (multi-datacenter distribution is also available for enterprise users).

 

Concertwith.me’s Competitive Edge: A Revamped Search UX with Algolia

There are a lot of music discovery apps on the market, yet sifting through concert listings is anything but seamless. That’s why Cyprus-based startup Concertwith.me aims to make finding local concerts and festivals as intuitive as possible. Automatically showing upcoming events in your area, the site offers personalized recommendations based on your preferences and your Facebook friends’ favorited music. Covering over 220,000 events globally, the site uses Algolia to offer meaningful results for visitors who are also looking for something different.

Founder Vit Myshlaev admits that concert sites often share the same pool of information. The differentiator is how that information is presented. “The biggest advantage one can have is user experience,” he explains. “There’s information out there, but do users find it? The reason that people don’t go to cool concerts is that they still don’t know about them!”

As an example, he showed me one of the largest live music discovery sites on the web. Searching for an artist required navigating a convoluted maze of links before pulling up irrelevant results. “Users have to type in queries without autocomplete, typo-tolerance, or internationalization. They have to scroll through a long list of answers and click on paginated links. That’s not what people want in 2014,” said Myshlaev.

To simplify search and make the results more relevant, Concertwith.me used our API. “We got a lot of user feedback for natural search,” Myshlaev wrote. Now visitors can search for artists and concerts instantly. With large user bases in the United States, Germany, France, Spain, Italy, Russia and Poland, Concertwith.me also benefits from Algolia’s multi-lingual search feature. “We’ve localized our app to many countries. For example, you can search in Russian or for artists that are Russian, and results will still come up,” says Myshlaev.


For users with a less targeted idea of what they’re looking for, Concertwith.me implemented structured search via faceting. “We also realized that some visitors don’t know what they want. Algolia search helps them find answers to questions like, Where will my favorite artist perform? How much do tickets cost? Are there any upcoming shows?”


Concertwith.me’s goal is to reduce informational noise so that users can find and discover music as quickly as possible. The startup experimented with a number of other search technologies before reading an article about us on Intercom.io, which inspired Myshlaev. “When I saw what Algolia could do, I knew that this was the competitive edge I was looking for.”

Want to build a search bar with multi-category auto-completion like Concertwith.me? Learn how through our tutorial.