Recently I had the opportunity to present at the Google Data Cloud Summit. This conference focused on how the next wave of data solutions can allow organizations to make smarter decisions and solve complex challenges using AI, machine learning, analytics, and databases.
So (surprise), I described how SpringML partnered with Google Cloud to allow the State of Hawaii to make smart decisions and solve a very complex challenge using AI, machine learning, analytics, and databases. If you missed it, you can watch the on-demand video.
To summarize here: our solution helped the State of Hawaii safely restore tourism during the pandemic. Tourism is the single largest component of the state’s economy; in 2019, over 10 million visitors generated $18 billion in spending before the pandemic shut down all travel. The Governor and the Office of Enterprise Technology Services quickly defined requirements for a system that would re-enable tourism, ensure public safety, and support law enforcement while presenting minimal inconvenience to travelers.

Using scalable, fully managed technologies on the Google Cloud Platform, SpringML built a suite of robust applications and went live with a soft launch for aircraft crews in just 3 weeks, followed by the full public launch 3 weeks later. To manage the growing volume of returning travelers, SpringML added extensive automation and analytics: automated screening of health documents, simplified traveler check-in, streamlined screening at the airport, and rich data visualizations for state decision-makers and the public.
A few quick stats highlighting the success of the system:
- The Safe Travels system managed every trip to Hawaii for 18 months – over 12 million visitors, during which daily arrivals increased from fewer than 5,000 back to pre-pandemic levels of 30,000 visitors per day
- SpringML trained Google’s Document AI (DocAI) to scan COVID-19 test result forms from over 40 medical providers, and later even handwritten CDC vaccine cards. DocAI scanned over 4 million uploaded forms, verified test results, and matched profile data to users with no human intervention
- Google’s Dialogflow-based Contact Center AI processed over 250,000 traveler check-ins through automated voice calls and SMS conversations
- Comprehensive public Looker dashboards provided data transparency to the public
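To make the “no human intervention” matching step above concrete, here is a minimal, hypothetical sketch of the kind of logic involved. This is not the production Safe Travels code: it assumes DocAI-style extraction has already produced a name and birth-date field from a scanned form, and the `TravelerProfile` type and field names are illustrative inventions.

```python
from dataclasses import dataclass
import unicodedata


@dataclass
class TravelerProfile:
    """Illustrative stand-in for a traveler's registered profile."""
    full_name: str
    date_of_birth: str  # ISO format, e.g. "1990-04-17"


def _normalize_name(name: str) -> str:
    """Lowercase, strip accents and punctuation, and collapse whitespace,
    so that e.g. "José O'Neill" and "JOSE ONEILL" compare equal."""
    decomposed = unicodedata.normalize("NFKD", name)
    ascii_only = decomposed.encode("ascii", "ignore").decode("ascii")
    cleaned = "".join(ch for ch in ascii_only.lower()
                      if ch.isalnum() or ch.isspace())
    return " ".join(cleaned.split())


def matches_profile(extracted: dict, profile: TravelerProfile) -> bool:
    """Return True when fields extracted from a scanned form match the
    traveler's profile. The dict keys mimic OCR output and are hypothetical."""
    name_ok = (_normalize_name(extracted.get("patient_name", ""))
               == _normalize_name(profile.full_name))
    dob_ok = extracted.get("date_of_birth", "") == profile.date_of_birth
    return name_ok and dob_ok
```

In the real system this comparison would sit downstream of DocAI’s extraction output and would need to handle far messier data (partial matches, transposed dates), but the shape of the automation is the same: normalize, compare, and only escalate mismatches.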
But back to the Google Data Cloud Summit itself! As a regular attendee as well as a presenter, I was struck by the astonishing rate of innovation on the platform. There were rapid-fire announcements of impressive new features for existing platform components, along with introductions of new technologies coming soon. The event really highlighted Google’s culture of rapid innovation, the powerful benefits of ever-expanding integration between components on the Google Cloud platform, and how pervasive machine learning is becoming – both as a tool for developers (and non-developers!) to apply and as a benefit to end users via application features.
My 5 takeaways:
- “Democratization” was a consistent theme across technology offerings. While the word struck me as awkward in this context, the concept resonated: reduced costs mean organizations can start using a technology for free in a development environment, and potentially stay free in production at low volumes. That epitomizes the concept of “cloud” – pay only for your actual consumption, and only after exceeding a generous free tier
- The announcement of BigLake (currently in preview) is really exciting: it allows organizations to unify disparate data sources in cost-effective storage while leveraging all of GCP’s machine learning and AI capabilities at Google scale, including BigQuery’s data management and optimization functions – business intelligence, machine learning, and governance. We work with customer after customer desperately trying to de-silo their data, so a central data lake spanning multiple vendors’ clouds will be a powerful way to democratize (now I’ve started using it!) all of an organization’s data across all of its functions
- The power and massive scalability of Google’s Cloud Spanner database is now available starting at a small scale ($65/month), with committed use discounts. In the past, I’ve seen projects with only moderate scalability needs choose a different database due to Spanner’s cost, so being able to start small with a fully managed, unlimited-scale (!) relational database will be really powerful
- Vertex AI is expanding in very cool ways to become an entire “AI factory” ecosystem. The Vertex AI Model Registry makes it easy to manage and deploy models (including BigQuery ML models); the Vertex AI Marketplace will enable discovery of pre-packaged and curated building blocks; and Vertex AI Pipelines (MLOps!) will let you serverlessly orchestrate ML workflows to automate and monitor repeatable processes
- Machine learning is becoming pervasive. Some of that is the continuously expanding ML tooling for developers, such as the Vertex AI ecosystem I mentioned in the last bullet and the extension of BigQuery analytics to BigLake. But Google itself is applying ML under the covers to significantly improve application features, such as the ability to ask natural language questions about BigQuery data in a Connected Google Sheet
At SpringML we love to solve data, machine learning, and analytics challenges for our customers. The Google Data Cloud Summit was an exciting reminder that the continuously-expanding breadth and depth of technologies on the Google Cloud Platform gives us an increasing arsenal of tools to solve our customers’ challenging problems at scale.