Over the past decade, machine learning, and deep learning in particular, has become remarkably accurate. How? Algorithms have certainly improved, but it has also become much easier to access powerful hardware that can process large amounts of data quickly. This creates a few problems, though. Training at this scale is expensive, which means that people without access to powerful data centers and hardware cannot participate in the innovation and application of the latest and best models. Running this infrastructure also consumes tremendous amounts of energy, which has a negative impact on the environment. This presentation covers some of the theory and history behind this trend, along with methods you can apply to reduce your carbon footprint, save money, and build models optimized for efficiency rather than accuracy alone.