Learn the key infrastructure skills required to excel at working with data and to deploy anything using Docker!
During this workshop, we’ll explore how Docker can help you:
– easily create and deploy complex Data Science environments on your own workstation or in the cloud using Docker and Docker Compose.
– build development environments with Anaconda, Python, R and Jupyter Notebook and connect them to PostgreSQL, Redis, MongoDB, CouchDB and more using Docker Compose. Set up ETL pipelines with Zeppelin Notebook and Spark.
– build geospatial engineering environments with QGIS, Jupyter, PostgreSQL, PostGIS, PGAdmin4 and MongoDB.
– publish your specifications to GitHub and your images to Docker Hub or a private image registry, and set up a Continuous Integration pipeline.
– deploy to Linux/s390x and other exotic platforms.
– get started with Data Science, Big Data or Containers and explore free courses and badges / certifications.
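To give a taste of the Compose-based setups above, here is a minimal sketch of a docker-compose.yml that wires a Jupyter notebook to PostgreSQL and Redis. The service names, image tags and password are illustrative assumptions, not the workshop's exact stack:

```yaml
# docker-compose.yml — hypothetical example, not the workshop's exact configuration
services:
  notebook:
    image: jupyter/datascience-notebook   # Jupyter with Python and R kernels
    ports:
      - "8888:8888"                       # expose the notebook UI on the host
    depends_on:
      - db
      - cache
  db:
    image: postgres:15
    environment:
      POSTGRES_PASSWORD: example          # placeholder credential
  cache:
    image: redis:7
```

Running `docker compose up` brings all three services up on a shared network, so code in the notebook can reach the database at hostname `db` and Redis at hostname `cache`.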
Mihai Criveti will lead us through this workshop. Mihai is the Cloud Native Competency leader at IBM, where he builds solutions for Cloud, AI, Big Data, Containers and more 🙂 More about Mihai: https://www.linkedin.com/in/crivetimihai