Data Visualisation
The key technologies that make up Novon’s Data Visualisation portfolio are: Descriptive, Predictive, Prescriptive and Augmented Analytics; Business Intelligence (BI); Artificial Intelligence (AI); Machine Learning (ML); and Streaming Data.
Data visualisation is playing an increasingly important role in identifying how we can improve our communities, our businesses, and our world. Data visualisation empowers people to make better decisions faster, and if you look at the fastest-growing companies in the past 5 years, chances are they have adopted near real-time data visualisation technologies to help them respond to and exceed customer expectations.
Digital transformation efforts, particularly in the realm of data and analytics, were once the sole responsibility of the CIO and their team. That paradigm is changing as data and analytics become more tightly woven into the business and its goals. Today, all C-level executives are, or should be, committed to treating data and analytics as a shared responsibility, and functional leaders are expected to empower their employees with the data and the skills they need to do their jobs.
Data visualisation will continue to be an even more important component, especially as more operations and services move into the digital space. The potential impact of that data will only get stronger as increased automation, AI, and forecasting models help us better predict and prepare for what is ahead. Today, those who have taken the initiative to shift to a digital-first mindset, driven by data, are better prepared to satisfy and exceed customer expectations.
So, how do we best achieve this practically, and which technologies best suit the task? The technologies Novon uses from its toolkit are all forms of Analytics, Business Intelligence, Artificial Intelligence / Machine Learning and Streaming Data. With these technologies, Novon can help you visualise any data you need to get ahead of your competition. Let’s have a quick look at the technologies and what they can do for you.
All analytic solutions can be provisioned as batch or near real-time data solutions.
Imagine the sales report you get on your desktop every morning now being updated every minute or faster. How about every second! The only technology change required to achieve this is how we feed the data to the analytics or BI engine. It could be a set of APIs, a backbone of SOA services or, now in 2020, a streaming data platform that provides an up-to-the-minute picture of how your organisation is performing. When helping organisations choose the right platform, Novon focuses not only on the platform itself, but on what you want to see and when you want to see it. This approach adds a richness to BI and Analytics solutions that has been lacking for the past few years, and indeed this richness is driving the phenomenal growth and use of Analytics in Australia!
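As a minimal sketch of that idea, the example below keeps a running sales total that a report can read at any moment. The event shape and class name are illustrative assumptions; a real implementation would read events from an API or streaming platform rather than a list. The point is that only the feed changes between a daily batch and a per-second update, not the reporting engine:

```python
from collections import deque

class SalesDashboard:
    """Toy BI 'engine': keeps a running total that a report can read at any time."""
    def __init__(self):
        self.total = 0.0
        self.recent = deque(maxlen=100)  # last 100 events for a live ticker

    def ingest(self, event):
        # The only change between a daily batch and streaming is how often
        # this method is called: once per nightly file vs once per event.
        self.total += event["amount"]
        self.recent.append(event)

dashboard = SalesDashboard()
for event in [{"sku": "A1", "amount": 120.0},
              {"sku": "B2", "amount": 75.5},
              {"sku": "A1", "amount": 30.0}]:
    dashboard.ingest(event)
```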
Descriptive Analytics and BI
Often called analysis, these offer very similar customer-driven solutions. They measure and display what has happened and, in various configurations, why it happened: for example, a monthly sales report, profitability per product line or marketing campaign, or market penetration of a given sector. They give you insight into how a sector or project has performed. This is the most basic form of analytics.
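To make the descriptive case concrete, here is a small, hypothetical example of summarising profitability per product line from past transactions. The record fields and figures are invented for illustration:

```python
from collections import defaultdict

# Illustrative transaction records; the fields are assumptions, not a real schema.
transactions = [
    {"month": "2020-01", "product": "Freight", "revenue": 1000, "cost": 700},
    {"month": "2020-01", "product": "Express", "revenue": 400,  "cost": 250},
    {"month": "2020-02", "product": "Freight", "revenue": 1200, "cost": 800},
]

def profit_by_product(rows):
    """Descriptive analytics: summarise what has already happened."""
    summary = defaultdict(float)
    for row in rows:
        summary[row["product"]] += row["revenue"] - row["cost"]
    return dict(summary)

report = profit_by_product(transactions)
```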
Predictive / Prescriptive / Augmented Analytics
Today’s organisations are under more pressure than ever when it comes to keeping, expanding, and attracting new customers. As such, innovation is essential, which is where Predictive and Prescriptive analytics, not analysis, can really help.
Predictive and Prescriptive analytics involve the programmatic study of data to uncover potential trends, to investigate the effects of decisions or events, or to evaluate the performance of a given tool or scenario. As analytics has evolved from descriptive to predictive and now to prescriptive, businesses will have to evolve as well and embrace new processes. By using an augmented analytics process, organisations can automatically find, visualise, and narrate potentially important data correlations. Often the best way to achieve this is to add AI and Machine Learning (ML) to your data analytics arsenal.
Augmented analytics platforms can take large amounts of data and organise and “clean” them to be usable for future analysis. This process speeds up grouping and aligning the data, so predictions and patterns are more easily recognised and acted upon. Again, AI and ML can speed up and improve the accuracy of this process.
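A rough sketch of the kind of cleaning step such a platform automates might look like the following. The record fields are invented for illustration; real platforms apply many such rules at scale:

```python
def clean(records):
    """Organise and 'clean' raw rows so later analysis sees consistent values.
    An augmented analytics platform automates steps like these at scale."""
    seen = set()
    out = []
    for r in records:
        name = r.get("customer", "").strip().lower()  # normalise spelling/case
        if not name or name in seen:
            continue  # drop blanks and duplicates
        seen.add(name)
        out.append({"customer": name, "spend": float(r.get("spend", 0))})
    return out

raw = [{"customer": " Acme ", "spend": "100"},
       {"customer": "acme", "spend": "100"},    # duplicate after normalisation
       {"customer": "", "spend": "5"},          # blank name, dropped
       {"customer": "Beta Co", "spend": "40"}]
tidy = clean(raw)
```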
Novon considers AI and ML inseparable technologies, fitting together to achieve the ultimate accuracy in predictive and prescriptive analytics.
For Novon, one of the most important new tech developments of the past decade has been the practical success of deep learning (popularly known as “artificial intelligence” or “AI”): the sophisticated statistical analysis of very large amounts of data. In the coming decade, data will continue to beget data, to break boundaries, to drive innovation and profits, and to create new challenges and concerns, most often about the privacy and security of data.
The constant increase in data processing speeds and bandwidth, the nonstop invention of new tools for creating, sharing, and consuming data, and the steady addition of new data creators and consumers around the world, ensure that data growth continues unabated.
For most people in business, machine learning seems like rocket science: expensive and talent-hungry. But with the advent of cloud as a service, just about any organisation can gain access to this sophisticated sector of the market at low cost and with minimal resource investment. Today, even if you are new to data visualisation science, you can kick-start an ML initiative without much investment by adopting a cloud-based solution, enabling you to grab the low-hanging fruit within your organisation and its marketplace.
Machine learning as a service (MLaaS) is an umbrella term for various cloud-based platforms that cover most infrastructure concerns, such as data pre-processing, model training, and model evaluation through to prediction. Prediction results can be bridged with your internal IT infrastructure through APIs. Talk to us about who provides these services and how you might enable them in your organisation.
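As an illustration of bridging through APIs, the sketch below packages internal data as a request to a hypothetical MLaaS scoring endpoint. The URL, header names, and payload shape are assumptions, since each provider defines its own; consult your provider's API reference for the real contract:

```python
import json
from urllib import request

def build_prediction_request(endpoint, api_key, features):
    """Package internal data as a request to a hypothetical MLaaS scoring API.
    The endpoint, auth header, and payload shape vary by provider."""
    body = json.dumps({"instances": [features]}).encode("utf-8")
    return request.Request(
        endpoint,
        data=body,
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {api_key}"},
        method="POST",
    )

req = build_prediction_request(
    "https://mlaas.example.com/v1/models/churn:predict",  # placeholder URL
    "API_KEY_FROM_VAULT",                                 # placeholder credential
    {"tenure_months": 18, "monthly_spend": 79.0},
)
# urllib.request.urlopen(req) would send it; the JSON response carries the scores.
```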
Streaming data could fundamentally change how companies architect data-driven solutions. A traditional relational data architecture is very prescriptive. When you try to analyse or create insights in this paradigm, you need a lot of supporting systems and code. These systems typically connect to a source system to extract data, apply business rules or logic through transformations, load the data into a data warehouse, load and aggregate it within a data mart for reporting, and build a set of reports or structures to analyse the data.
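The traditional pipeline described above can be sketched end to end with an in-memory database. The table names, business rule, and figures are invented for illustration:

```python
import sqlite3

# Minimal sketch of the traditional pipeline:
# extract -> transform -> load warehouse -> aggregate into a mart.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INT, region TEXT, amount REAL)")
src.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(1, "nsw", 100.0), (2, "vic", 250.0), (3, "nsw", 50.0)])

# Extract from the source system
rows = src.execute("SELECT id, region, amount FROM orders").fetchall()

# Transform: apply a business rule (standardise region codes)
rows = [(i, region.upper(), amount) for i, region, amount in rows]

# Load into the 'warehouse', then aggregate into a reporting 'mart'
wh = sqlite3.connect(":memory:")
wh.execute("CREATE TABLE fact_orders (id INT, region TEXT, amount REAL)")
wh.executemany("INSERT INTO fact_orders VALUES (?, ?, ?)", rows)
wh.execute("""CREATE TABLE mart_sales AS
              SELECT region, SUM(amount) AS total
              FROM fact_orders GROUP BY region""")
mart = dict(wh.execute("SELECT region, total FROM mart_sales").fetchall())
```

Every arrow in that chain is a system to build and maintain, which is exactly where the challenges below come from.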
This traditional approach can create challenges: the data may be stale; after transformation, the user may not know the source of the data or whether it was changed during processing; and the raw data may be difficult or impossible to retrieve for comparing outcomes.
So, why would I use streaming data?
Well, for one, streaming data is faster data, so if your use case requires up-to-the-minute data, streaming just might be the solution. Capturing data and making it available within an organisation quickly will be a differentiator for companies moving forward in today’s modern data architecture. For example, a customer is filling out an order on a logistics website and runs into an issue completing it. They reach out for help by calling the customer service line. What if the customer service representative could know exactly what page the customer is on, what they were trying to do, and the specific error being displayed when the customer calls? By using streaming data, that is possible!
One common misconception with streaming data is that all data needs to be delivered in near real-time. While that is possible, a lot can be delivered in batch. The main points to consider with streaming data are: how much latency is acceptable on the new data, what volume of data you are working with, and whether the records should be processed individually.
In the logistics example above, near real-time data is required, but an insurance claims processing engine would be satisfied with a batched data process that runs, say, every ten minutes. Either way, the streaming paradigm is significantly faster than many current solutions, most of which run overnight.
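The trade-off between the two cases comes down to window size. A minimal sketch of micro-batching, with invented event data, groups a time-ordered stream into fixed windows; the logistics case would use a very small window, the claims engine a ten-minute one:

```python
def micro_batches(events, window_seconds=600):
    """Group a time-ordered stream of (timestamp, payload) pairs into
    fixed windows (default ten minutes), one batch per window."""
    batches = {}
    for ts, payload in events:
        window = ts - (ts % window_seconds)  # start of this event's window
        batches.setdefault(window, []).append(payload)
    return [batches[k] for k in sorted(batches)]

# Timestamps in seconds; claims 1 and 2 share a window, 3 and 4 do not.
events = [(0, "claim-1"), (120, "claim-2"), (650, "claim-3"), (1300, "claim-4")]
batches = micro_batches(events)
```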
Streaming data can be more available: Another practice possible with a streaming architecture is to create a streaming hub. A streaming data hub supports sharing data across departments or lines of business and can significantly increase analytics and insight opportunities. Having a single view of a customer across all product offerings not only creates a streamlined experience for them, but also allows you to better understand each customer’s product utilisation, behaviour patterns, and willingness to try new products. This approach is not unique to streaming data: real-time integration also provides for this architecture and, in many cases, just requires redirecting and repurposing to achieve this 360-degree view of your customer. Data dissemination also instils the concepts of data producers and data consumers within the organisation.
Streaming data is more flexible: In today’s agile environment, flexibility is key; iterative development, experimentation, and failing fast are the norm. Many leading organisations are realising the benefits of data experimentation. The streaming paradigm is central to data experimentation methodology, serving data rapidly to support prototyping and delivering insights quickly. Ultimately, streaming data offers benefits that can fundamentally change data processing for many organisations. Much like a cloud data migration, streaming data is a journey that impacts technical and business users from all departments and needs to be treated as a core data asset.
The architecture: There are a couple of tried-and-true architectural approaches that the market has eagerly adopted: the Kappa and Lambda architectures. Novon believes they satisfy different use cases and applies either architecture depending on the customer’s use case.
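At a very high level, the difference can be sketched in a few lines: Lambda serves queries by merging a precomputed batch view with a real-time “speed” view, while Kappa keeps a single code path over one replayable event log. The account data below is invented for illustration; real systems would use a distributed log and stream processor rather than Python lists:

```python
def lambda_query(batch_view, speed_view, key):
    """Lambda: merge a precomputed batch view with a real-time delta."""
    return batch_view.get(key, 0) + speed_view.get(key, 0)

def kappa_query(event_log, key):
    """Kappa: one code path -- recompute (or replay) from the single event log."""
    return sum(amount for k, amount in event_log if k == key)

event_log = [("acct-7", 10), ("acct-9", 4), ("acct-7", 5)]
batch_view = {"acct-7": 10, "acct-9": 4}  # nightly batch covers earlier events
speed_view = {"acct-7": 5}                # speed layer covers arrivals since
```

Both paths answer the same question; Lambda trades a second code path for cheaper serving, while Kappa trades replay cost for a single implementation to maintain.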