Open Data in Finance Conference: Chair’s Welcome
Here is the script of the Chair’s remarks at the opening of the Open Data in Finance Conference in London (June 15, 2016).
09:00 – 09:10
Hello. Welcome to the Open Data in Finance Conference. I am really excited to be a part of this conference, and I am looking forward to learning a lot myself.
You might wonder why I am chairing today. That is a good question, but my guess is that I happen to be one of the better-known figures in the technical community around API access management, and I have had a financial background since the very beginning of my career, some 27 years ago.
If you are technical and work on user authentication and API security, you have probably seen my name. It is on some of the most widely used specifications, such as JSON Web Signature, JSON Web Token, OAuth PKCE, and OpenID Connect.
However, since this is an end-user-driven conference rather than a technical one, let me talk a little about what I have been doing.
I work for a company called Nomura Research Institute. For people in the financial sector, especially in investment banking, the name “Nomura” probably rings a bell. Yes, we used to be a subsidiary of that Nomura. Back then, I worked at the headquarters of Nomura Securities, researching the stock market using a variety of numerical methods, including back-propagation neural networks, which would be called deep learning these days. At the time, collecting data was non-trivial: I had to travel to the major exchanges and data providers in Europe to buy the data on tapes. Analytical infrastructure was also a problem; I ended up buying 30 SPARCstations to run the calculations, which meant a lot of maintenance. How I envy people today, who get their data online and leverage fast hardware and the cloud for analytics.
Then, in 1993, the commercial internet started. It was slow, but you could connect through dial-up. Connecting your desktop application to a data endpoint to pull down the data and process it started to look practical. So, in 1996, a bunch of people started to work on a standard that would allow exactly that. It was called OFX. I was one of those people, and that was my entry point into the world of standardization.
Now, I am a full-time standardization officer. My specialty is digital identity and privacy. Identity, in my world, means a set of attributes related to an entity. Thus, bank account data is by definition part of your digital identity, as is other personal data. Financial instruments are another kind of entity, and any data and metadata related to them are identity as well. I now work in international standardization bodies such as ISO, the IETF, and the OpenID Foundation, where, by the way, I preside as chairman of the board.
So, what happened to API initiatives such as OFX? It saw a certain amount of adoption, especially in the United States, but it is far from ubiquitous. I suspect there were multiple reasons, but the use of SOAP, XML, and especially XML Signature was among them, just as it was for other APIs.
Twenty years on, though, we may be standing at an inflection point.
From the technical point of view, we now have REST, JSON, and JSON Web Signature (JWS), which I co-wrote. On the API access authorization side, we now have the OAuth 2.0 authorization framework, which, by the way, is composed of many specifications. On the identity and authentication side, we have OpenID Connect, which has been gaining a lot of momentum lately.
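For readers curious what “JSON plus JWS” looks like in practice, here is a minimal sketch of the JWS compact serialization that JSON Web Tokens build on, written with only the Python standard library. The key, claim names, and choice of HS256 are illustrative assumptions, not part of the talk; a real deployment would use a vetted JOSE library.

```python
import base64
import hashlib
import hmac
import json


def b64url(data: bytes) -> str:
    # Base64url without padding, as JWS requires
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode("ascii")


def sign_jwt(claims: dict, key: bytes) -> str:
    # JWS compact serialization: header.payload.signature
    header = {"alg": "HS256", "typ": "JWT"}
    signing_input = (
        b64url(json.dumps(header, separators=(",", ":")).encode())
        + "."
        + b64url(json.dumps(claims, separators=(",", ":")).encode())
    )
    sig = hmac.new(key, signing_input.encode("ascii"), hashlib.sha256).digest()
    return signing_input + "." + b64url(sig)


def verify_jwt(token: str, key: bytes) -> dict:
    # Recompute the HMAC over header.payload and compare in constant time
    signing_input, _, sig = token.rpartition(".")
    expected = b64url(hmac.new(key, signing_input.encode("ascii"), hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        raise ValueError("bad signature")
    payload_b64 = signing_input.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore stripped padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))


# Hypothetical issuer and subject, purely for illustration
token = sign_jwt({"sub": "acct-123", "iss": "https://bank.example"}, b"secret")
print(verify_jwt(token, b"secret")["sub"])  # acct-123
```

The three dot-separated base64url segments are exactly what makes these tokens easy to pass through HTTP headers and URLs, which is part of why developers found the JSON/JWS stack so much lighter than XML Signature.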
Developers love them. When I stroll down the hallways of technical conferences, many people come up to thank me for how easy and fast they are to use, as I am one of the authors of many of these specifications. I was at the Cloud Identity Summit last week, held in New Orleans in the United States, and a couple of people did the same. A person from a famous electronics company, now big in the IoT world, came to me and said they now have over 600 OpenID Connect-enabled applications. Another person, from one of the biggest banks in America, told me that they are converting their applications to OpenID Connect, and that it was easy as well as very fast: seven to eight times faster. Many governments are deploying them as well. FranceConnect, as the name suggests, uses OpenID Connect. The Japanese government will be releasing an API gateway in 2017, also using OAuth and OpenID Connect. On the telecom side, GSMA’s Mobile Connect uses the same, as the name suggests. And major cloud providers such as Google, Microsoft, Salesforce, and Amazon base their infrastructure on OAuth and OpenID Connect. If you use an Android phone, you are using them without knowing it. In fact, a large percentage of mobile apps use OAuth. It is ubiquitous now.
So, one of the major barriers has been removed.
From the financial industry’s point of view, we have also seen a lot of change lately.
In the United States, a group of banks got together at FS-ISAC last year to produce a specification called the Durable Data API (DDA), which uses JSON, REST, and OAuth 2.0. OFX released a new version, OFX 2.2, this year, after 10 years of dormancy, again using OAuth 2.0.
On the European side, we have seen a big legislative push: the Payment Services Directive 2. With it, in the UK, the Open Banking Standard was released this January 1. It calls for a governance mechanism, together with data, API, and security standards, to be established using JSON, REST, and OAuth/OpenID Connect.
Coincidence? Well, I suspect great minds think alike.
Ladies and gentlemen, please welcome two of those great minds, the chairs of the Open Banking Working Group: Matt Hammerstein, Managing Director, Head of Customer and Client Experience, Barclays, and Gavin Starks, the CEO of the Open Data Institute.
Armchair Chat: How Open Data & Data Sharing can Shape the Financial Services Industry
This high-level discussion between a select group of industry thought-leaders will examine:
- Key outputs from the Open Banking Working Group, and practical next steps that will ensure this continues to be driven forward
- Understanding what sort of data will be opened up, and what products and services will emerge as a result
- Realistic timelines for implementation, both in the UK and Europe
- How data can be made available across the industry in a harmonised way
- How organisations can put in place long-term strategic plans, while there is still ambiguity about the regulations that will govern this change