A framework for storing and processing huge structured and semi-structured datasets (up to petabytes of data). Because Hadoop clusters several computers to process data in parallel, it makes working with large amounts of information fast and relatively easy.
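The map-and-reduce idea behind Hadoop's parallel processing can be sketched in plain Python. This is a hypothetical word-count example, not Hadoop's API: each string in `chunks` stands in for a file split processed on a different node, `map_phase` does the per-node work, and `reduce_phase` merges the partial results.

```python
from collections import Counter
from functools import reduce

def map_phase(chunk):
    # Each "node" counts words in its own split of the data.
    return Counter(chunk.split())

def reduce_phase(left, right):
    # Partial counts from different nodes are merged into one result.
    return left + right

# Stand-ins for file splits that Hadoop would distribute across the cluster.
chunks = ["big data big", "data cluster"]
counts = reduce(reduce_phase, map(map_phase, chunks))
print(dict(counts))
```

In a real Hadoop job the map calls run on many machines at once over blocks of a distributed file system, which is where the speed on petabyte-scale data comes from.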
A platform for developing, testing, shipping, and deploying software faster and more consistently. Docker packages and runs applications in so-called containers. Because it shortens the path from coding to production, it's one of the primary tools of DevOps.
The most popular software version control system. Git is used in projects of all sizes (including one-person projects), but it's most useful for distributed teams cooperating on a single project. It supports thousands of simultaneous branches and ensures data integrity.
A high-level deep learning library for working with TensorFlow 2. Developed at Google, Keras is a developer-friendly library that makes implementing neural networks painless. It has a Python frontend (here meaning the interface you program against, not a web frontend) and several backend options (the engines that run the computations, not web backends), allows quick prototyping, and has a large user base.
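To show how little code a working network takes, here is a minimal sketch using the Keras `Sequential` API. The layer sizes (4 inputs, 16 hidden units, 3 output classes) are arbitrary choices for illustration.

```python
from tensorflow import keras

# A small feed-forward classifier: 4 input features, 3 output classes.
model = keras.Sequential([
    keras.Input(shape=(4,)),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(3, activation="softmax"),
])

# Compiling attaches an optimizer and loss; the model is then ready
# for model.fit(...) on real data.
model.compile(optimizer="adam", loss="categorical_crossentropy")
```

Each `Dense` layer here contributes weights plus biases (4×16+16 and 16×3+3), so the whole model has 131 trainable parameters.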
A container orchestration system (it controls where and how containers run) that scales, deploys, and manages cloud-native applications automatically. Originally developed by Google, Kubernetes works with container runtimes such as Docker, containerd, and CRI-O.
.NET is a platform mostly used for building and running C# applications. There are several implementations of .NET: .NET Framework (the original, Windows-only implementation), .NET (formerly .NET Core; cross-platform and open source), and Mono/Xamarin (used mainly for mobile and macOS applications).
An ML (deep learning) framework used for computer vision and NLP applications (including autonomous driving). PyTorch's strengths include dynamic computational graphs, a comprehensive ecosystem of tools and libraries, and speed, which help move projects from prototyping to production faster. It is primarily used with Python.
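"Dynamic computational graphs" means PyTorch records operations as ordinary Python code runs, so gradients can be computed afterwards with no separate graph-definition step. A minimal sketch with made-up values:

```python
import torch

# The graph is built on the fly as these lines execute.
x = torch.tensor(2.0, requires_grad=True)
y = x ** 3 + 2 * x          # y = x^3 + 2x

# Backpropagate through the recorded operations.
y.backward()

# dy/dx = 3x^2 + 2, which is 14 at x = 2.
print(x.grad)
```

Because the graph follows normal Python control flow, loops and conditionals can change the network's structure from one forward pass to the next, which is what makes prototyping in PyTorch feel like writing plain Python.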