In the past decade, machine learning, and deep learning in particular, has become remarkably accurate. How? Algorithms have certainly improved, but powerful hardware that can process large amounts of data quickly has also become far more accessible. This progress comes with a few problems: training large models is expensive, which means that people without access to powerful data centers and hardware are shut out of the innovation and application of the latest and best models. Running this infrastructure also consumes tremendous amounts of energy, with a real cost to the environment. This presentation explains some of the theory and history behind these trends, along with practical methods you can apply to reduce your carbon footprint, save money, and build models optimized for efficiency rather than accuracy alone.