Daniella Traino, Pinecone Technology Strategies

Connectivity in the built environment has many benefits, but the reality is that smart buildings are collecting data that could be used in nefarious ways in the future, and the entry points for infiltration are hard to plug.

At Tomorrowland19, Pinecone Technology Strategies’ data and cyber security expert, Daniella Traino, offered her thoughts on the potential opportunities and challenges of the digital era.

To set the scene, Traino put some of the assumptions about our digital world under the microscope.

Connectivity is a buzzword in the tech world but she says we really aren’t as connected as we like to think. “What we are is internet connected.”

And in our consumer-led society, this internet connectivity has been all about trying to reduce friction in our lives.

“We are building capabilities and tech that expect an immediate response and immediate reaction. But what we’ve done as a result is create a culture of instant gratification.”

And it’s not just Millennials. “My parents are two generations older than that and are getting into that space.”

The other trend is personalisation, which relies on algorithms that determine your personal interests and needs.

Traino pointed out that algorithms are not new. “We have been an algorithm economy since the 17th century; a Persian mathematician came up with the idea.”

She says that algorithms need data in order to personalise the experience, and increasingly, it’s all about getting this data from “things and people”.

These advances in connectivity and data collation provide opportunities to solve complex challenges in our society, such as in health and mobility. But these technology advances also create new risks.

The threat to security

From a cybersecurity perspective, highly connected pieces of technology are also incredibly vulnerable because they open up thousands of new entry points.

An autonomous car, for instance, needs a huge amount of data from its environment, as well as from other cars, to be able to move around safely.

She says these cars are also collecting data on the human, and “not just the phone playing Spotify or mapping where you are going, but also collecting your fatigue level using facial recognition, and your heartbeat.”

This ambient data may not be collected for nefarious purposes in the first instance but there’s no certainty as to what will be done with this information down the track.

From a software perspective, Traino says around 4 per cent of lines of code contain a vulnerability.

“It’s really hard to write secure code, and it’s not taught in universities very well.”

She says it’s only a matter of time until someone identifies these vulnerabilities and uses them to shut these systems down.

The built environment is no different

The same goes for the built environment.

Connectivity in the built environment has many benefits. Traino spoke of a project where smart technology was put into buildings so that visual sensors would detect when a tenant walked in carrying shopping bags. The sensor would then light the way to the kitchen, and would even know whether the tenant was carrying fresh food and control the temperature accordingly.

This is a great outcome for convenience, but like the car, smart buildings are collecting data that could be used in nefarious ways in the future.

“So when you are building this into your infrastructure, you need to build trust… Half the time as individuals we don’t know if we have given consent, or if we have choice, and if that choice is clear.”

Humans are next to be wired up

Traino says that humans are becoming increasingly wired.

Although this might help us become healthier, this tech comes with a level of responsibility and the need for consent.

For example, Facebook recently invested in AI-driven machine learning that taps into brainwaves so that instead of typing or saying what you want, it picks up the brainwaves and converts them into text.

This has undeniable benefits for the disability sector, but “you should imagine if there is something understanding your brainwaves, can it also do the reverse?”

“People working in cyber security, much like marketers, are good at finding the dark side of things,” she joked.

From a marketing perspective, personalisation can be used to manipulate people in ways you’d never expect. One example is emitting scents you like, such as vanilla or jasmine, in stores to promote happiness and stimulate you to spend more.

There are even sensors out there now that can measure your heartbeat, and whether you’re stressed or flustered, from a few metres away.

Not everyone is cyber and data informed

The problem is that the average literacy level is equivalent to that of a fifth grader, so it’s challenging to explain what it means to give consent, and to provide that choice in a transparent and clear way.

“I’m cyber informed but most of the community isn’t.”

Interestingly, the way in which we trust is changing. Traino says we trust digital assistants more than we trust humans, which raises questions about the importance of being able to discern the difference between a machine and a human.

The solution to more sustainable, responsible AI needs to be global.

Many countries have released principles around ethical AI, but she says this nation-by-nation approach to implementing “guardrails” is “such a western mentality.”

“In order for us to have better resilience in our societies and what we are building, and ensuring that we don’t build something we won’t be proud of, we need to work at a level beyond just one country.

“You cannot put guardrails on to instruct how AI should be built or implemented just in Australia and think the rest of the world will follow.”

Traino advocates for a universal approach “so the whole world agrees that this is the best way to responsibly use this tech, and then implement it based on each region’s cultural needs.

“But we’re not thinking that way.”

It’s all happening too fast

One of the dangers is that technology is advancing so fast that the ethical and legal frameworks aren’t keeping up.

She says when we don’t have enough time to make decisions “we will trust whatever we get and just move on.”

“We just don’t have the time, and that’s quite a dangerous situation to be in when we are effectively on an information highway.

“Just because we can trust a technology, doesn’t mean we should. The reason is because some of this AI tech is very brittle, let alone the ethics and responsibilities behind it.”

She says AI is fed data from us and is therefore a reflection of our society, and “sometimes we get it really wrong.”

When it goes wrong

From a cyber security perspective, data breaches are a real threat. In 2018, the Marriott hotel chain’s IT systems were hacked, exposing hundreds of millions of customer records, including credit card and passport numbers.

Malicious ransomware attacks can also shut systems and organisations down completely, as happened at Cadbury’s chocolate factory in Hobart when it was hit by the “Petya” ransomware attack in 2017.

Data breaches can also have devastating commercial consequences. For example, a Chinese aviation endeavour has been suspected of trying to steal data and intellectual property from leaders in the aviation industry in an effort to narrow China’s technological gap.

Part of a system of systems of systems

Traino wanted to show the audience that “we are part of a system of systems of systems, whether we realise it or not.

“And what we’re doing is creating this connected ball of thread, and when it snaps on one side, it affects the relationship of the whole.

“The challenge is what we are now creating is a borderless system where what happens in one country can have a ripple effect in another, what happens in one building can have a ripple effect in multiple communities.

“And cyber and privacy, or lack thereof, can create some havoc.”


This article is part of Tomorrowland19 – I, human special report, read the full report here.
