Data Handling: Acquire, Organise, Analyse, Deliver


Data Accessibility
Data Analysis
Data Ownership
Database Structures
Data Warehouse
April 13, 2024
Stephanie Wiechers
There are four steps in the process of generating insights from data. The first is acquisition: doing the actual work and logging the corresponding data. This data is then organised, combining data sources where necessary and creating a format that is easy to work with. The third step is analysis, which can be anything from summing the data to visualising it or performing big data analysis. The final step is delivery, the moment the output is used - for example, an interactive dashboard with company metrics, or automated emails showing the financials and unusual numbers of the past month.
This article will explain what is done in each of these steps in detail, including examples of what that looks like for different types of businesses.
Remember - we use AI for data extraction & organisation, but we also employ humans (some of them enjoy drawing)


Acquire

Data acquisition is essentially the process of determining what aspects of your business operations need to be logged and how to go about it. The heart of this process is the actual operation of the business, involving the people working and the money flowing in and out. These operations are documented, either automatically or manually, through various means. For instance, machinery can be equipped with sensors that automatically log data about its operation, and sales made through an online platform, such as ticket sales, are automatically logged into the system. On the other hand, data can also be logged into a system manually. This could involve logging customer interactions into a CRM system, logging service requests into an ERP system, or manually checking in on operations and documenting the status. In essence, data acquisition is about capturing the reality of your business operations in a structured and accessible format.
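As a minimal illustration of "capturing reality in a structured format", the sketch below logs an automatic sensor reading and an online sale into one structured CSV log. The source and event names are invented for illustration:

```python
import csv
import io
from datetime import datetime, timezone

def log_event(writer, source: str, event: str, value: float) -> dict:
    """Append one operational event as a structured row (timestamp, source, event, value)."""
    row = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "source": source,
        "event": event,
        "value": value,
    }
    writer.writerow(row)
    return row

# Two very different operations land in the same structured log:
buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=["timestamp", "source", "event", "value"])
writer.writeheader()
log_event(writer, "machine-7", "temperature_c", 71.4)  # automatic sensor log
log_event(writer, "webshop", "ticket_sale_eur", 25.0)  # automatic sales log
print(buffer.getvalue())
```

The same shape works whether the row comes from a sensor, a webshop, or someone typing in a status check: what matters is that every event ends up with the same fields.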

Data Acquisition Industry Examples

Below are some examples of data and means of generating data (= operation + medium chosen) per industry. Note that information written down on a piece of paper is also data, just not digital. Besides dedicated systems, Excel sheets are also a popular means of storing acquired data.


Manufacturing

  • ERP system(s), e.g. Ridder or SAP
  • Direct logging
  • Accounting software

Financial Services

  • Project management
  • Internal system data
  • Deal data


Hospitality & Events

  • POS data
  • Ticket sales & Event management
  • HR data

Environmental Services

  • Operational data
  • Accounting statements


Organise

Step one: connect. Before organising data, it's essential to create links between the logged data sources and the database/platform/lake. This is often done through API calls, after which data sources can be linked intelligently where desired. The goal is to allow for efficient data analysis and easy extraction of insights. Moreover, it enhances data consistency and integrity, making sure that the data used across different platforms or systems is uniform and up to date. It can also help to identify problems with data integrity.
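A small sketch of the kind of integrity check such links make possible - comparing the same field across two connected sources. The CRM and billing data here are hypothetical:

```python
def check_consistency(source_a: dict, source_b: dict) -> list:
    """Compare two data sources keyed by record id; return the ids whose values disagree.

    source_a / source_b map record id -> value, e.g. a customer's email
    address as stored in the CRM versus in the billing system.
    """
    issues = []
    for record_id in source_a.keys() & source_b.keys():  # ids present in both sources
        if source_a[record_id] != source_b[record_id]:
            issues.append(record_id)
    return sorted(issues)

crm = {"c1": "anna@example.com", "c2": "ben@example.com"}
billing = {"c1": "anna@example.com", "c2": "ben@oldmail.com"}
print(check_consistency(crm, billing))  # → ['c2']
```

Without the connection between the two systems, the stale address in billing would simply never surface.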
Organising data involves various methods and techniques that depend on the nature of the data, the kind of insights desired, and the tools available. It can be stored in structured databases like SQL, where data is stored in tables and can be easily queried. For unstructured or semi-structured data, NoSQL databases like MongoDB or data lakes can be used. Cloud storage solutions can also be used for storing and organising data, offering scalability and remote access. Data can be organised in a hierarchical, network, relational, or object-oriented model, depending on its relationships and use. Furthermore, data can be distributed in different environments - on-premise, cloud, or hybrid. The use of data integration tools can also help in gathering and combining data from different sources, and data virtualisation can provide a way to aggregate data from various sources without needing to move it.
  1. Structured or relational data platforms, which include:
      • Physical data warehouse: a large store of data collected from a wide range of sources.
      • ODS (Operational Data Store): a database designed to integrate data from multiple sources for additional operations on the data.
      • MDM (Master Data Management): a method used to define and manage the critical data of an organisation to provide a single point of reference.
      • Data hub: a collection of data from various sources, organised for distribution, sharing, and often subsetting.
      • Data marts: subsets of a data warehouse that provide data for reporting and analysis on a particular department, unit, or subject area.
  2. Nonrelational platforms, which include data lakes: a system or repository of data stored in its natural/raw format, usually object blobs or files.
  3. Virtualisation: a virtual version of the resource (data), such as a server, storage device, network, or operating system, where the framework divides the data into one or more execution environments.
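To make the structured-versus-unstructured distinction concrete, here is a toy sketch using Python's built-in sqlite3: the same business keeps sales as queryable rows in a relational table, while parking raw review blobs as-is, data-lake style. Table and field names are invented for illustration:

```python
import json
import sqlite3

# Structured: rows in a relational table, easy to query with SQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, product TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales (product, amount) VALUES (?, ?)",
    [("ticket", 25.0), ("ticket", 25.0), ("merch", 15.0)],
)
total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)  # 65.0

# Unstructured/semi-structured: raw JSON blobs kept in their natural format,
# to be parsed only when someone actually needs them.
raw_events = [json.dumps({"type": "review", "text": "Great show!", "stars": 5})]
print(raw_events[0])
```

The table answers "how much did we sell?" in one query; the blob keeps everything, including fields nobody has thought to ask about yet.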

Organise: an Explanation in Layman's Terms

Data organisation is best compared to a library. Just as a library categorises books by genre, author, and title, data should be categorised based on its nature and use. It is then stored in different 'shelves' or databases. Some databases are structured like tables (similar to Excel sheets), which are great for data that fits into specific categories. Others are more flexible and can handle data that doesn't fit neatly into table cells.
Think of these databases as different rooms in the library, each holding different types of books. Some rooms might have novels (structured data), while others might have collections of personal letters (unstructured data).
Data is also often stored in different locations based on the company's needs. It's like having several branches of the library in different parts of the city. Some data might be stored on the company's own servers (on-premise), some might be stored in the cloud (like storing books in a shared public library), and some might be stored in a combination of the two (hybrid).
Finally, companies often use special tools to bring all this data together when they need it. This is like having a librarian who can quickly fetch books from any room or branch of the library. These tools help companies view and analyse their data, no matter where it's stored - preventing endless searching through folders, Excel sheets and documents.


Analyse

The next step in the data handling process is analysis. This involves processing the organised data to extract meaningful insights. This step requires a deep understanding of both the data and the tools used.


Depending on the complexity of your data, the way it is organised, and the insights you need, you might be able to prepare and analyse the data yourself, or you might need a dedicated tool or service. If the initial database or warehouse organisation is done right, it will save a lot of time on analysis: it becomes easier and faster to extract the right data, integrity checks are performed automatically where possible, and there is great clarity about the exact data available.
When doing data extraction yourself, you can use SQL queries to extract specific data from your databases or data warehouse. If your data is well-structured and organised in a smart warehouse, extracting the right data should be relatively straightforward. You could even do a quick .csv or Excel extract if you want it quick and dirty!
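A quick sketch of that workflow: querying a (toy, in-memory) warehouse table with SQL and dumping the result to CSV for an Excel hand-off. The table and column names are made up for illustration:

```python
import csv
import io
import sqlite3

# A toy warehouse table standing in for a real one.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE revenue (month TEXT, region TEXT, amount REAL)")
conn.executemany("INSERT INTO revenue VALUES (?, ?, ?)", [
    ("2024-01", "north", 1200.0),
    ("2024-01", "south", 800.0),
    ("2024-02", "north", 1350.0),
])

# Extract exactly the rows you need with a SQL query...
rows = conn.execute(
    "SELECT month, SUM(amount) FROM revenue GROUP BY month ORDER BY month"
).fetchall()

# ...and dump them to CSV for a quick-and-dirty Excel hand-off.
out = io.StringIO()
writer = csv.writer(out)
writer.writerow(["month", "total"])
writer.writerows(rows)
print(out.getvalue())
```

In a well-organised warehouse, the query is the hard part that stays easy; against a messy one, you would first be reverse-engineering which of five "revenue" columns is the real one.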


The actual processing of the data can take many forms, depending on what you want to achieve.
  • Analytics for reporting: use your data to create reports on company metrics and KPIs, make forecasts, and discover anomalies. This type of analysis is often used for regular reporting and monitoring.
  • Advanced analytics: in-depth type of analysis that can involve big data analysis, AI, and training and testing of machine learning models. Use cases for advanced analytics can include predictive maintenance and the development of tools that can learn and automate certain tasks.
Remember, the goal of data analysis is not just to crunch numbers, but to extract meaningful insights that can help guide your business decisions.
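As a tiny example of anomaly discovery in reporting analytics, the sketch below flags values more than two standard deviations from the mean. This is one of the simplest possible approaches, shown only to illustrate the idea; the order counts are invented:

```python
from statistics import mean, stdev

def find_anomalies(values: list, threshold: float = 2.0) -> list:
    """Flag values more than `threshold` standard deviations from the mean."""
    mu = mean(values)
    sigma = stdev(values)
    return [v for v in values if abs(v - mu) > threshold * sigma]

# Monthly order counts with one clearly unusual month.
orders = [102, 98, 105, 99, 101, 100, 40]
print(find_anomalies(orders))  # → [40]
```

The flagged number is not the insight itself - the insight is the question it triggers: what happened in that month?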


Deliver

The final step in the data handling process is delivery. This is when the insights derived from the analysis are presented in a way that can be easily understood and utilised by the relevant parties. This can be done through various means:
  • Visuals: graphs, charts, and infographics visualise data and make it easier to understand at a glance. They can be particularly useful when dealing with large amounts of data or complex relationships. Obviously, it’s still extremely important to create the right insights - otherwise it’s just a pretty graph.
  • BI dashboards: Business Intelligence tools like Microsoft Power BI, Looker Studio, Qlik, and Metabase handle these visualisations in real time. They are used to track KPIs, identify trends and anomalies, and provide actionable insights.
  • Automated email reports: generated from the data analysis which can be set up to be sent out automatically to relevant stakeholders/departments. They can be used to provide regular updates on key metrics or to alert the team when there are significant changes in the data. The great advantage here is that there is no hassle with login codes and no extra effort or habit that needs to be built.
  • Natural language output: this involves using AI to translate data insights into written text. It can be particularly useful for delivering insights in a way that is easy to understand for individuals who are not data experts.
To summarise: the output of data analysis can be delivered in many formats, such as email, Excel, PowerPoint, or (online or emailed) dashboards. The format chosen will depend on the needs and preferences of the end users.
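As an illustration of the automated-report idea, the sketch below composes such an email with Python's standard library. The addresses and metric names are placeholders, and the actual sending and scheduling (smtplib plus a cron job or similar) are left out:

```python
from email.message import EmailMessage

def build_report_email(recipient: str, metrics: dict) -> EmailMessage:
    """Compose a plain-text metrics report; sending is left to a scheduler."""
    body_lines = [f"{name}: {value}" for name, value in metrics.items()]
    msg = EmailMessage()
    msg["Subject"] = "Monthly metrics report"
    msg["From"] = "reports@example.com"  # placeholder sender
    msg["To"] = recipient
    msg.set_content("\n".join(body_lines))
    return msg

msg = build_report_email("team@example.com", {"revenue": "12,300", "new customers": 41})
print(msg["Subject"])
print(msg.get_content())
```

Because the report arrives in the inbox, nobody has to remember a login or build a new habit - which is exactly the advantage described above.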

What we haven’t discussed

We haven't gone into the details of data quality, management of systems, GDPR, or the importance of having clear ownership in an organisation yet - but don't worry, we've got you covered! You can find more about these topics in our other blog posts, so feel free to check them out. And if the answer is not there, let us know.
What Pearstop does in this process is make sure all that beautifully logged data is ready for use. We create the right connections, make sure all communication steps (see image) are set up, and ensure the data is structured and easy to use. Our software and our people are great at handling anything in the organisation and analysis categories, and also have ample experience with virtually any BI tool available (hey, we wanted insights into our own data, too).


Conclusion

To summarise (yes, it says conclusion, not summary): getting (the right) insights from your company's daily operations can be an easy process if it's been set up right, or a very time-consuming one that requires creating the same analysis over and over again. Doing the work of a proper data setup once allows you to automate the bulk of the work, leaving you time to do your actual work.
PFIEW! We love geeking out with our tools, but we understand that these things can be a bit dry if it's not your game. Anyhow, we think it's fun, and we have smart software and smart people who love setting these things up in a smart (and, for you, easy) way. Tell us about your annoyances with handling your data.

We publish a bimonthly newsletter. Get the latest industry applications for data analytics, data engineering, data science and AI.