WHEN BLOCKCHAIN MEETS BLADE RUNNER

Kazuaki Ishiguro, Chief Blockchain Architect, Connectome.

How is Connectome looking to change AI and the data sharing paradigm today?

At the moment, AI is heavily dependent on data, such that whoever has more data is going to have a more precise and capable model. Facebook, Google, and Amazon provide free services that keep people coming back. But at the end of the day, the data they collect is used for marketing purposes to drive sales. And of course, there’s nothing wrong with this.

One problem with this approach is that a startup that doesn’t have the same resources as these companies will not have any data sets to train its AI on, which means it’ll also have more trouble attracting customers. On top of that, it’s very expensive to store all the data you need to train an AI, especially after factoring in the costs of protecting that data. Even so, Facebook has already suffered some high-profile leaks.

Ultimately, we want to provide people with more choice while allowing them to keep the data sets with them. Companies like Google and Facebook have so much customer data on their hands right now, and we want to change that.

Where does blockchain technology factor into this vision?

Most people associate blockchain with cryptocurrency. But at its core, blockchain is really about having the right to store your own data and being able to prove that the data is yours. Today, you store all your private data on Google’s or Facebook’s servers, which means it can potentially be seen by employees at these companies. Take Amazon’s Alexa smart speakers, for example. An employee could copy an unencrypted conversation and send it to another worker to ask how to improve Alexa’s response based on what was said. In other words, when you put a smart speaker in your living room, someone could be privy to your entire conversation.

However, blockchain lets you keep the data on the client side, instead of sending it to someone’s database. Your data remains on your device and is not shared with a third party. That’s partly why companies like Samsung have been trying to implement blockchain technology on their mobile phones.

With respect to AI, blockchain could be used to facilitate so-called data transactions. Your data is stored with you and encrypted with a key that only you hold, but if someone wants your data, they pay you in cryptocurrency for access.
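
To make that flow concrete, here is a minimal sketch of such a data transaction in Python. The Ledger and DataOwner classes are hypothetical stand-ins invented for illustration: payment is simulated in memory rather than settled on a real blockchain, and a production system would use a proper key-exchange scheme instead of handing over the raw key. Only the encryption itself uses a real library, the cryptography package’s Fernet API.

```python
# Hypothetical sketch of a "data transaction": data stays encrypted on the
# owner's device, and the decryption key is released only after payment clears.
from cryptography.fernet import Fernet


class Ledger:
    """Stand-in for a blockchain ledger: tracks balances and transfers."""

    def __init__(self):
        self.balances = {}

    def transfer(self, sender, receiver, amount):
        if self.balances.get(sender, 0) < amount:
            return False  # payment fails, so the key is never released
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount
        return True


class DataOwner:
    """Keeps data encrypted locally; only ciphertext ever leaves the device."""

    def __init__(self, name, data):
        self.name = name
        self._key = Fernet.generate_key()                  # stays with the owner
        self.ciphertext = Fernet(self._key).encrypt(data)  # shareable form

    def sell_access(self, buyer, price, ledger):
        # Release the decryption key only after the payment clears.
        if ledger.transfer(buyer, self.name, price):
            return self._key
        return None


ledger = Ledger()
ledger.balances["buyer"] = 100
owner = DataOwner("alice", b"my private conversation log")

key = owner.sell_access("buyer", price=10, ledger=ledger)
if key is not None:
    print(Fernet(key).decrypt(owner.ciphertext))  # b'my private conversation log'
```

The point is the ordering: the ciphertext can circulate freely, but the data only becomes readable after the owner has been paid.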

Right now, companies are effectively using people’s private data for free, and they’re using it to sell more products. I think that’s wrong, and blockchain could be a possible solution.

Data should be stored locally with the client, and people should be paid for sharing their data.

Connectome is also working on Virtual Human Agents (VHAs). How are these different from voice assistants like Google Assistant and Amazon Alexa?

One big difference is the ability of a Virtual Human Agent to provide visual feedback, such as facial expressions. Smart speakers are basically just disembodied voices right now. When you use a smart speaker, it’s difficult to care about the AI on it, let alone think about treating it nicely or like a friend. A smart speaker can’t display any emotion, so it won’t get mad even if you call it stupid. That means communication is effectively one-way, and there’s no possibility of establishing a relationship of any sort.

However, if AI were more visual and emotional, people would start treating it more nicely, or at least more in line with how they would treat a real person. The data collected from these exchanges would then be closer to how people interact with each other in the real world. This is great for facilitating more natural conversations, which in turn can result in higher-quality interactions.

At Connectome, some of our developers come from game development backgrounds, including work on Final Fantasy. We’re looking to take the expertise that comes from modeling expressions on in-game characters and apply it to our VHAs.

Where are you getting your data from right now and how else are you looking to bolster your security and privacy credentials?

We use public data shared by Google or by universities. Moving forward, we also hope to improve transparency. For instance, there’s no way of knowing what data Google or Amazon uses to train its AI. It’s sort of a secret sauce for them. However, when we develop our VHAs, we will point out which data sets we use, and these are in turn stored on a distributed blockchain network. This is important for accountability as well, since people will always be able to trace the data sets that were used.
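
As a hedged sketch of that traceability idea, the toy chain below records a hash of each training data set in an append-only log, so anyone can later verify which data a model was trained on. The ProvenanceChain class, its block layout, and the dataset names are all assumptions made for illustration, not Connectome’s actual implementation.

```python
# Illustrative (hypothetical) dataset-provenance log: each block stores a
# fingerprint of a training data set and links to the previous block.
import hashlib
import json
import time


def sha256(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()


class ProvenanceChain:
    """Toy append-only log; a real deployment would use a blockchain network."""

    def __init__(self):
        self.blocks = []

    def record_dataset(self, name: str, contents: bytes) -> dict:
        prev = self.blocks[-1]["block_hash"] if self.blocks else "0" * 64
        block = {
            "dataset": name,
            "dataset_hash": sha256(contents),  # fingerprint of the training data
            "timestamp": time.time(),
            "prev_hash": prev,  # links blocks so history cannot be rewritten silently
        }
        block["block_hash"] = sha256(json.dumps(block, sort_keys=True).encode())
        self.blocks.append(block)
        return block

    def verify(self, name: str, contents: bytes) -> bool:
        """Anyone can check that a given data set matches a recorded hash."""
        return any(
            b["dataset"] == name and b["dataset_hash"] == sha256(contents)
            for b in self.blocks
        )


chain = ProvenanceChain()
chain.record_dataset("public-conversations-v1", b"...training corpus bytes...")
print(chain.verify("public-conversations-v1", b"...training corpus bytes..."))  # True
```

Because each block references the previous block’s hash, quietly swapping out a recorded data set later would break the chain, which is what makes the audit trail usable for accountability.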

What applications do you foresee for VHAs?

We are currently doing some proof-of-concept work with a Japanese advertising agency, collecting data on how people react to and communicate with the virtual agent. Another scenario we experimented with was a coworking space, where the VHA helped people with questions such as what the Wi-Fi password was or general directions around the office.

In the future, we hope to put a VHA in cars, which might tie in nicely with autonomous driving if it ever catches on. The VHA would be able to have a conversation with the passenger and capture their needs. It might even be able to communicate with other agents used in stores or other service providers. For example, if the rider was hungry or thirsty, the agent could talk to other agents, make a decision about where to go, and then tell the agent onboard the car to stop at a certain restaurant for a drink. Everything is moving toward greater autonomy, so VHAs could see extensive use in the service industry.
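
Purely as a speculative illustration of that scenario, here is one way the exchange could look, with the in-car agent broadcasting a need and nearby service agents answering with offers. Every class, message format, and name here (ServiceAgent, CarAgent, Offer) is hypothetical; nothing in the interview specifies how Connectome’s agents would actually negotiate.

```python
# Hypothetical agent-to-agent negotiation: the car agent collects offers
# for a passenger's need and picks the closest provider.
from dataclasses import dataclass


@dataclass
class Offer:
    provider: str
    item: str
    distance_km: float


class ServiceAgent:
    """Agent run by a store or restaurant, answering requests it can serve."""

    def __init__(self, provider, menu):
        self.provider = provider
        self.menu = menu  # maps a need (e.g. "drink") to distance in km

    def respond(self, need):
        if need in self.menu:
            return Offer(self.provider, need, self.menu[need])
        return None


class CarAgent:
    """In-car agent that talks to service agents on the passenger's behalf."""

    def request(self, need, agents):
        offers = [o for a in agents if (o := a.respond(need)) is not None]
        return min(offers, key=lambda o: o.distance_km, default=None)


agents = [
    ServiceAgent("cafe-a", {"drink": 0.4, "snack": 0.4}),
    ServiceAgent("restaurant-b", {"meal": 1.2, "drink": 1.2}),
]
best = CarAgent().request("drink", agents)
if best:
    print(f"Stopping at {best.provider}, {best.distance_km} km away")
```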

Will we ever see something like Joi from Blade Runner?

That’s the end goal. Compared to robotics, virtualization is far more cost-effective and versatile. Robots are generally good for specific tasks, but a VHA will need to handle various things and adapt dynamically. At the moment, we have certain augmented reality experiences that let you see the agent through your smartphone camera, and we’re also going to do a hackathon with Microsoft in China using the HoloLens, so we could see AR glasses come into play as well. Unfortunately, that particular usage is hindered by existing cellular networks, which are not yet as fast as we need them to be. But as 5G networks roll out and speeds improve, it could soon be possible to transfer huge amounts of data in real time, making VHAs on AR glasses more feasible. When that happens, imagine being able to gather all your friends’ agents and head out together.

There’s also not going to be a one-size-fits-all approach to VHAs. We’ll start with a template, but we’ll also make an SDK available that lets developers add different looks or features to the agent to suit their needs.

 