In a quest for use cases for Artificial Intelligence (AI) in the public sector, I reviewed some of the recent discussions and research papers.
Points of note included:
- Many academic institutions have studied AI in public services and offer solid recommendations.
- Implementation of recommendations is happening worldwide at a pace affected by many different factors.
- The biggest leap for the public sector lies in making data available through “Open Data” initiatives, which in turn enables IoT services and provides data for others to analyze for their own use cases.
- The most important ingredient is Data.
P.K. Agarwal, Dean and CEO of Northeastern University, built his career fostering industry interactions, particularly in the public services space. In his view, public sector services are going through a maturing process, which he terms “eServices: PK’s Ladder 2.0.”
PK explains that public-sector services have actually grown faster than the initial hypothesis, and in some regions the services are already in the “Integrate” phase. He has demonstrated his “CAL-BOT,” a mobile application that integrates voice and payment gateways and lets the phone user transact with various public agencies, including the DMV and the courts.
On Wednesday, May 3rd, 2017, the audience at the New York Smart Cities conference witnessed the City of Los Angeles’ new, unsung, hardworking employee – CHIP, an acronym for City Hall Internet Personality.
The robot CHIP is built on the fundamentals of continuous learning and integration between multiple city departments. It acts as a personal assistant for visitors on the website.
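CHIP’s internals are not public, but the basic idea of a website assistant that routes visitor questions to the right city department can be sketched with a toy keyword matcher. The departments and keywords below are hypothetical illustrations, not Los Angeles’ actual configuration:

```python
# Toy illustration of a city-website assistant: route a visitor's question
# to a department by counting keyword matches. Departments and keywords
# here are invented examples, not CHIP's real implementation.

DEPARTMENT_KEYWORDS = {
    "dmv": ["license", "registration", "vehicle"],
    "courts": ["ticket", "fine", "hearing"],
    "utilities": ["water", "power", "bill"],
}

def route_question(question: str) -> str:
    """Return the department whose keywords best match the question."""
    # Lowercase and strip trailing punctuation so "registration?" matches.
    words = [w.strip("?.!,") for w in question.lower().split()]
    best_dept, best_hits = "general", 0
    for dept, keywords in DEPARTMENT_KEYWORDS.items():
        hits = sum(1 for kw in keywords if kw in words)
        if hits > best_hits:
            best_dept, best_hits = dept, hits
    return best_dept

print(route_question("How do I renew my vehicle registration?"))  # dmv
```

A production assistant would replace the keyword table with a trained intent classifier that improves as visitors interact with it – the “continuous learning” the article describes.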
And to carry the story forward, there is a surge in the number of stories across various online and print magazines about public service chatbots – built with help from well-known names such as Microsoft, Facebook, and even Salesforce!
Local governments in and around Chicago have started using Chatter, a collaboration tool from Salesforce. The tool helped various public services departments understand the bigger picture of activities planned or performed outside their own departments, saving taxpayer dollars and earning respect in the eyes of residents.
On May 9, 2013, President Obama signed an executive order that made open and machine-readable data the new default for government information. With this initiative from the White House, a whole new perspective arose in terms of Open Government and Open Data.
A lot more public services data is now available for integration and consumption. This will only catalyze collaboration and integration across the various services. At the same time, it raises new questions about how to store and manage the data. Applying machine learning algorithms to it is not far off, but it is not here yet.
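As a concrete sketch of what “machine-readable by default” enables, the snippet below parses a small JSON dataset of the kind an open data portal might publish and summarizes open service requests per department. The fields and records are invented for illustration; real portals serve similar JSON via bulk download or an API:

```python
import json

# Hypothetical machine-readable dataset, inlined for illustration; real
# open data portals expose records like these for anyone to consume.
sample = """[
  {"department": "sanitation", "status": "open"},
  {"department": "sanitation", "status": "closed"},
  {"department": "transport",  "status": "open"}
]"""

def open_requests_by_department(raw_json: str) -> dict:
    """Count still-open service requests per department."""
    counts = {}
    for record in json.loads(raw_json):
        if record["status"] == "open":
            counts[record["department"]] = counts.get(record["department"], 0) + 1
    return counts

print(open_requests_by_department(sample))  # {'sanitation': 1, 'transport': 1}
```

Because the data is structured rather than locked in PDFs, a few lines of code suffice; the same records could just as easily feed a dashboard or a machine learning pipeline.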
Most public services innovation has been limited to large cities or state governments, while many smaller local governments have been slower to adopt new technologies. In my view, it should be the other way around: smaller governments have fewer complexities, simpler systems, and easier ways to test services. They should be the first to adopt new technology and serve as pilots.
Lack of budget and infrastructure is often cited as a deterrent to moving forward. OpenText and multiple other IT service providers are trying to close this gap through cloud services. OpenText Cloud Services, for example, lets organizations leverage hosted IT infrastructure to manage their data and provide services to their own clients.
While some governments have started to understand the need, others are still awakening to it. As many speakers on this topic have already said – “The future has already arrived”. So hurry!