In the latest SmartCitiesWorld podcast, Graeme Neill speaks to Alexander Zschaler, regional sales director at Cloudera Germany.
This podcast is in association with Cloudera and IBM Power Systems. Cloudera has been an IBM business partner since 2016 and together they have built a solution that unleashes the insights of big data to power the digital enterprise. Across the globe, customers span financial services, retail, telecommunications, manufacturing and government and tackle use cases such as threat prevention, customer insights and process optimization.
Data is obviously at the heart of all of our lives, driven last decade by the smartphone boom. The years ahead will see data on a huge scale being produced by millions of different kinds of sensors. This data can be rich and power hitherto unforeseen use cases, but in order to work, it needs to be processed effectively. And that’s where big data and machine learning come in. SmartCitiesWorld recently spoke to Alexander Zschaler, regional sales director at Cloudera Germany, to find out more.
Below are some edited highlights from the discussion. For more detail or to listen on the go, download or stream the podcast.
Graeme Neill, editor, SmartCitiesWorld: Could you begin by introducing Cloudera and your position within the industry?
Alexander Zschaler, regional sales director, Cloudera Germany: Cloudera believes that data can make what is impossible today possible tomorrow. Data drives our lives, how we work and how we are connected with each other. And this data has one special trait: it is growing rapidly, everywhere.
However, there’s a huge demand for a platform that puts this data to use and makes it even more productive, and delivering exactly that kind of data platform is one aspect Cloudera is focusing on. On the other side is our mission: we want to empower people to transform this complex data into clear and actionable insights they can use to develop new services, or to make decisions supported by machine learning and AI.
What is very special to us is that we are completely agnostic to deployment method. If you’d like to run this on-premises, not a problem at all; if you want to run it in the cloud - private, public or even hybrid - it’s totally up to you.
Cloudera on IBM Power Systems delivers fast insights and reduces infrastructure costs for data-at-rest as well as data-in-motion projects. We remain leaders in open source with a joint commitment to open standards, providing a powerful combination of open software and hardware that delivers innovation at speed, without vendor lock-in.
Neill: Why are big data and machine learning so important to cities right now?
Zschaler: All data will be big data sooner or later, especially for federal agencies, state offices and city offices. The question is: how can they turn this data into insights and put those insights to work? This requires a modernised infrastructure that can match the ever-growing flow. We will see 157 zettabytes of data by 2025, of which 80 per cent will be unstructured. Machine learning and AI capabilities can help spot problems before they occur, leading to smoother city operations.
Data alone does not make things smarter. It’s the end-to-end approach, from edge to AI - collecting, managing and analysing this data - that is core to modernising how a city operates.
Neill: What are the kinds of data that are being produced today in cities that they can take advantage of?
Zschaler: Smart cities continuously strive to make the right data available to the right people at the right time, to help build solutions to complex urban challenges. At rock bottom, the answer would be any kind of data you could imagine, right?
Let’s take the example of the smart waste bin and how we get from sensors to real benefits for the citizen. The approach - and how we [at Cloudera] can address this - would be that these sensors talk to each other and check whether any other bins close by are also nearly full. We can then optimise the collection routes automatically, based on machine learning capabilities.
However, that’s still not the end; it’s just the beginning of a beautiful journey from our point of view. Additional analytics can now be put on top of the data in this example, leading to better planning decisions - where do we need more of these bins? A simple question, but hard to answer if you don’t have the underlying data, the right staffing and so on. All these questions - what do we actually need, where and when - can be answered by this kind of integrated approach.
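The waste-bin idea above can be sketched in a few lines of code. This is a minimal, hypothetical illustration only - the bin IDs, coordinates, fill levels and the "nearly full" threshold are all invented, and it uses a simple greedy nearest-neighbour heuristic rather than any particular routing or machine-learning product discussed in the podcast.

```python
import math

# Hypothetical bin readings: (id, latitude, longitude, fill fraction).
# All values are illustrative, not from the podcast.
BINS = [
    ("A", 52.520, 13.405, 0.92),
    ("B", 52.516, 13.377, 0.35),
    ("C", 52.523, 13.411, 0.81),
    ("D", 52.530, 13.384, 0.78),
]
DEPOT = (52.525, 13.400)      # assumed start point of the collection truck
FULL_THRESHOLD = 0.75         # assumed cutoff for "nearly full"

def distance(p, q):
    # Plain Euclidean distance on coordinates; adequate for a small area.
    return math.hypot(p[0] - q[0], p[1] - q[1])

def plan_route(bins, depot, threshold=FULL_THRESHOLD):
    """Greedy nearest-neighbour route over bins above the fill threshold."""
    to_visit = [b for b in bins if b[3] >= threshold]
    route, pos = [], depot
    while to_visit:
        # Always drive to the closest remaining full bin.
        nxt = min(to_visit, key=lambda b: distance(pos, (b[1], b[2])))
        route.append(nxt[0])
        pos = (nxt[1], nxt[2])
        to_visit.remove(nxt)
    return route

print(plan_route(BINS, DEPOT))  # bin B stays unvisited: it is only 35% full
```

In practice a city would replace the greedy heuristic with a proper vehicle-routing solver and feed it live sensor data, but the sketch shows the core idea: only nearly-full bins trigger a visit, which is what shortens the routes.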
Neill: Why aren’t cities able to deal with data effectively today?
Zschaler: From our experience talking to governments and agencies in more than 40 countries, the public sector faces common data challenges, and the majority can be condensed down to five aspects. Number one: legacy systems, which are simply not equipped for today’s volume, variety and velocity of data. Number two: silos everywhere - every agency keeps its own data, for several reasons. Number three: security and governance - we need to be secure from every angle.
Number four is the cultural aspect: by that we mean a sometimes limited willingness to collaborate between the different stakeholders in a project. Finally, cloud has arrived, I would say, in people’s heads, but not yet in the execution - they do not really know how to execute on it.
Neill: Why are open platforms so important? What are the other benefits of cities using big data and machine learning to handle their data?
Zschaler: The open platform, especially with smart cities, is a very important topic. Smart services create a multi-dimensional and multi-layered environment with multiple stakeholders. Open platforms are key because of economies of scale: transparency and accessibility of data sets are crucial to make a service successful and to leverage the same data set across different services. This will drive innovation and other benefits for citizens.