Modern data-driven business infrastructure is not as effective as it should be when it comes to critical decision making. End-to-end data pipelines are composed of fundamentally diverse pieces of technology, each focused on a specific frontend (e.g., DataFrames, Tensors, Streams) and running in total isolation; as a result, they are highly unoptimised and hard to integrate with event-based business logic. Our research group has been exploring how advanced systems theory can be used to compile, optimise and execute distributed functions in unison across the whole spectrum of data-driven programming, leading to a unified way to combine analytics and services all the way down to hardware execution, and making continuous intelligence a reality.
- Introducing the concept of Continuous Intelligence and why we are not there yet.
- Pinpointing weaknesses in how we structure data-driven pipelines today.
- Explaining the potential of an Intermediate Representation (IR) and Shared Hardware Execution support to solve the problem.
- Presenting our vision for how this new technology can radically change the way we declare and distill knowledge from data in a fast-changing world.
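To make the IR idea above concrete, here is a minimal, hypothetical sketch (all names and node types are illustrative, not the group's actual design): two different frontends, a DataFrame-style filter and a streaming `where` clause, lower to the same small set of dataflow IR nodes, which is what would let a single optimiser and executor serve the whole spectrum of frontends.

```python
from dataclasses import dataclass, field

# Hypothetical shared dataflow IR: each frontend lowers its operations to
# the same small vocabulary of nodes ("scan", "filter", "map", ...), so
# optimisation and hardware execution happen once, below all frontends.

@dataclass
class Node:
    op: str                                  # IR operation name
    args: tuple = ()                         # static arguments
    inputs: list = field(default_factory=list)  # upstream IR nodes

def lower_dataframe_filter(source: Node, column: str, value) -> Node:
    # A DataFrame-style `df[df[column] > value]` lowers to a generic filter.
    return Node("filter", (column, ">", value), [source])

def lower_stream_where(source: Node, column: str, value) -> Node:
    # A streaming `where` clause lowers to the *same* filter node,
    # erasing the frontend distinction at the IR level.
    return Node("filter", (column, ">", value), [source])

table = Node("scan", ("orders",))
from_dataframe = lower_dataframe_filter(table, "amount", 100)
from_stream = lower_stream_where(table, "amount", 100)
assert from_dataframe == from_stream  # identical IR regardless of frontend
```

Once both frontends meet in one representation like this, cross-frontend optimisations (e.g., fusing an analytics filter with an event-handling predicate) become ordinary IR rewrites rather than ad-hoc integration work.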