Python coder working for the Red Hat Product Security team.
Type hinting hands-on
A brief introduction to stronger typing features will be presented (including type hints and abstract base classes), followed by a series of hands-on use cases designed to familiarize the audience with the 'typing' stdlib module.
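A minimal sketch of the kind of exercise the workshop covers, combining type hints with an abstract base class (the `Shape`/`Circle` names are illustrative, not taken from the workshop materials):

```python
# Combining abstract base classes with type hints from the 'typing' module.
from abc import ABC, abstractmethod
from typing import List


class Shape(ABC):
    @abstractmethod
    def area(self) -> float:
        """Every concrete Shape must implement area()."""


class Circle(Shape):
    def __init__(self, radius: float) -> None:
        self.radius = radius

    def area(self) -> float:
        return 3.14159 * self.radius ** 2


def total_area(shapes: List[Shape]) -> float:
    # The annotation documents (and lets a checker verify) that any
    # Shape subclass is accepted here.
    return sum(shape.area() for shape in shapes)


print(total_area([Circle(1.0), Circle(2.0)]))
```

Running a static checker such as mypy over a file like this catches mistakes such as passing a plain string into `total_area`.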
Open people's minds and open technologies for a better geospatial world.
Open source, open data, open standards, open minded people.
Nearly everything we do is somehow related to some position and time. Geographical information systems (GIS) are computer systems that handle spatial (and temporal) data, helping us analyze the data, understand its context, and try to predict future states.
In this workshop, we are going to go through the basics of GIS and spatial data representation (vector and raster), touch on coordinate reference systems, and introduce some fundamental GIS concepts.
Python is the most geo-positive programming language nowadays, with bindings in most desktop and server programs as well as core libraries. In this workshop, we are going to give a brief introduction to Python geospatial programming and an overview of how Python can be used together with various spatial tools.
Senior quality engineer on Satellite 5, working at Red Hat Czech. Likes Bash and Python, and likes when things are green, working, and boring.
Writing my first Ansible module
Maybe you are using Ansible but have realized you are using the "shell" module where something nicer could be used? We will go through creating a simple "yum-repo-manager" module.
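At its core, an Ansible module is just an executable that reads its arguments and prints a JSON result. A minimal sketch of that contract follows; the repo logic is a hypothetical stand-in for the workshop's "yum-repo-manager", not its actual code (a real module would build on ansible.module_utils.basic.AnsibleModule):

```python
#!/usr/bin/env python
# Minimal sketch of what an Ansible module boils down to: read arguments,
# do the work, print a JSON result dict with at least a "changed" key.
import json
import sys


def manage_repo(name, enabled):
    """Pretend to enable/disable a yum repo; return the module result."""
    # A real implementation would edit /etc/yum.repos.d/<name>.repo here.
    return {"changed": True, "repo": name, "enabled": enabled}


if __name__ == "__main__":
    # Ansible passes module arguments as JSON in a file given as argv[1];
    # the defaults here are just for running the sketch by hand.
    params = {"name": "epel", "enabled": True}
    if len(sys.argv) > 1:
        with open(sys.argv[1]) as f:
            params = json.load(f).get("ANSIBLE_MODULE_ARGS", params)
    print(json.dumps(manage_repo(params["name"], params["enabled"])))
```

The workshop's module would replace `manage_repo` with real repo-file handling and use `AnsibleModule` for argument validation and `exit_json`/`fail_json` reporting.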
Programmer at BUT Faculty of Mechanical Engineering,
game development hobbyist.
My experience with Python ranges from web development to embedding the interpreter into C++, GUI development with PySide (Qt), and OpenGL graphics.
Hands on experience with SDL, PySide and OpenGL.
Typical problems often handled incorrectly in games/apps:
- measure of time (sleep? better not);
- resolution (how to not think in pixels, but not forget they exist);
- shared code between tools and engine.
Self-taught occasional Pythonista, though mostly a front-end developer at MSD these days
A promotion of the https://rosalind.info/problems/list-view site for self-study, plus solving a few of the problems together.
This can be done in 2 hours or over a whole day, both for beginners (copy-pasting strings into Python files) and for more advanced attendees (validating command-line inputs, streaming potentially large files, wrapping solutions into maintainable modules or classes, computing multiple outputs from one input, etc.).
It would be great if attendees already have Anaconda with Python 3 installed and IPython Notebook working.
Python fan, meetup organizer, PyLadies mentor. Leads the Python Maintenance team at Red Hat.
Turn On the Lights with MicroPython
Python can power websites, software systems, games, computations, and a lot more of what happens behind a computer screen. But can it leave the computer world and do something "real"?
MicroPython is a special version of Python, made to fit on devices small enough to fit in your hand and cheap enough to experiment with, without you worrying about damaging them.
With a couple of wires and electronic components, and a few lines of Python, you can turn motors, blink lights, measure temperatures or distances -- and eventually build all kinds of controllers, robots, toys, and other devices.
This workshop is a beginner-level introduction to the hardware world. We'll connect a microcontroller board to a computer and put on a colorful LED light show.
If you know variables and if/for/while statements, you know enough Python to participate.
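To give a taste of the LED show, the color math below is plain Python and runs anywhere; the device-specific parts (MicroPython's machine and neopixel modules, with pin 4 as an assumed wiring) are sketched as comments:

```python
# Color-wheel helper for a rainbow LED show.

def wheel(pos):
    """Map 0-255 to an (r, g, b) rainbow color."""
    pos = pos % 256
    if pos < 85:
        return (255 - pos * 3, pos * 3, 0)   # red -> green
    if pos < 170:
        pos -= 85
        return (0, 255 - pos * 3, pos * 3)   # green -> blue
    pos -= 170
    return (pos * 3, 0, 255 - pos * 3)       # blue -> red

# On a board, the colors would be written out roughly like this:
# import machine, neopixel, time
# np = neopixel.NeoPixel(machine.Pin(4), 8)  # 8 LEDs on pin 4 (assumed)
# for step in range(256):
#     for i in range(8):
#         np[i] = wheel(step + i * 32)
#     np.write()
#     time.sleep_ms(20)

print(wheel(0), wheel(85), wheel(170))
```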
You will need a computer with administrator access (to install the necessary drivers).
Python programmer, data fan, meetup organizer, PyLadies coach, and hobby iOS mobile developer.
New insights into data with Python
Each of us knows Instagram. But how can Instagram be used to improve product recommendations? It's always important to know what matters and what doesn't. As a data scientist you can find the right answers to many questions in various industries. Python is one of the most used programming languages in data science, and it offers many libraries that make your job easier so you can focus on your data.
In this workshop, we are going to extract meaning from a larger data set, perform data mining, design a test data set, create visualizations using matplotlib, build product recommendations, and learn best practices for cleaning and preparing your data.
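As a toy illustration of the recommendation idea, co-occurrence counting over purchase baskets can be sketched in a few lines (the baskets here are made-up data, not the workshop's data set):

```python
# Item-based recommendation via pairwise co-occurrence counts.
from collections import Counter
from itertools import combinations

baskets = [
    {"camera", "tripod", "sd-card"},
    {"camera", "sd-card"},
    {"tripod", "lens"},
]

# Count how often each pair of items appears in the same basket.
cooccur = Counter()
for basket in baskets:
    for a, b in combinations(sorted(basket), 2):
        cooccur[(a, b)] += 1
        cooccur[(b, a)] += 1


def recommend(item, n=2):
    """Items most often bought together with `item`."""
    scores = Counter({b: c for (a, b), c in cooccur.items() if a == item})
    return [other for other, _ in scores.most_common(n)]


print(recommend("camera"))
```

Real data-science pipelines replace the hand-written counting with pandas/NumPy operations, but the underlying idea is the same.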
We will be using IPython Notebook with Python 3. This workshop is for people with basic/intermediate knowledge of Python.
PhD student in network analysis and fan of... Scala. He still enjoys programming independently of the language (except Perl). Active member of WarszawScala and open source contributor to various projects: nebulostore (Java), twitter/cassovary (Scala), palantir/typedjsonrpc (Python). Proud to be working @Growbots.
PySpark on Google Cloud Dataproc: a MapReduce job in Python on 20 machines in 20 minutes
Some people believe that processing gigabytes of data using a bunch of machines doesn't come easy. In this workshop we will prove them wrong. The plan is simple: set up a Spark cluster in the cloud and run a few jobs on hundred-gigabyte datasets.
Setting up a Spark cluster with the Hadoop File System is not an easy task if done from scratch. Google Cloud Platform lets you set up a Hadoop cluster with Spark in a hassle-free manner. During the workshop we will show how to set everything up, make use of the distributed processing tools provided by PySpark (RDDs and DataFrames), and conclude with some fun exercises.
Apache Spark is a cluster computing engine that implements the MapReduce paradigm. We will quickly introduce it and go through its architecture.
In the first part of the training, we will see how to create and manage a Spark cluster using Google Cloud Dataproc. We will focus on an environment with Python 3 and Jupyter Notebook.
In the second part, you will see how easy it is to write MapReduce jobs to process gigabytes of data on tens of machines in parallel.
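The MapReduce pattern that Spark parallelizes can be sketched in plain Python; in PySpark the same word count is roughly `lines.flatMap(lambda l: l.split()).map(lambda w: (w, 1)).reduceByKey(lambda a, b: a + b)`:

```python
# Word count expressed as map / shuffle / reduce phases, the three steps
# that Spark distributes across the cluster.
from itertools import groupby
from operator import itemgetter


def word_count(lines):
    # Map phase: emit a (word, 1) pair for every word.
    pairs = [(word, 1) for line in lines for word in line.split()]
    # Shuffle phase: bring pairs with the same key together.
    pairs.sort(key=itemgetter(0))
    # Reduce phase: sum the counts for each key.
    return {word: sum(count for _, count in group)
            for word, group in groupby(pairs, key=itemgetter(0))}


print(word_count(["big data big", "data"]))
```

On a single machine this is trivial; Spark's contribution is running the map and reduce phases on many machines and handling the shuffle between them.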
Each participant will need a Google Cloud account with some dollars to spend (it's possible to get a demo account with $200) and the Dataproc service enabled. A laptop with an Internet connection is a must, too.