Monday, Jun 24, 1:00 PM - 5:00 PM
Room: 105

To get the most out of the tutorials, you will need to have the correct software installed and running. Specific requirements are listed in each tutorial's detailed description, but it's best to start with one of the scientific Python distributions to ensure an environment that includes most of the packages you'll need.

IPython in depth

Fernando Perez - University of California, Berkeley
Brian Granger - Chronicle Labs


Videos

  • Part 1 [1:25:00]
  • Part 2 [1:34:54]
  • Part 3 [36:52]

Bio(s)

Fernando Perez
Fernando Perez received his PhD in theoretical physics from the University of Colorado and then worked on numerical algorithm development in the Applied Mathematics Department at the same university. He now works as a scientist at the Helen Wills Neuroscience Institute at the University of California, Berkeley, focusing on the development of new analysis methods for brain imaging problems and high-level scientific computing tools. Towards the end of his graduate studies, he became involved with the development of Python tools for scientific computing. He started the open source IPython project in 2001 when looking for a more efficient interactive workflow for everyday scientific tasks. He continues to lead the IPython project along with a growing team of talented developers. He is also a member of the core matplotlib development team, and has contributed to numpy, scipy, sympy, mayavi and other Python projects.

Brian Granger
Brian Granger is an Assistant Professor of Physics at Cal Poly San Luis Obispo and co-founder of Chronicle Labs (chronicle.io). He has a background in theoretical atomic, molecular and optical physics with a Ph.D. from the University of Colorado, Boulder. His current interests include quantum computing, parallel and distributed computing and interactive computing environments. He is a core developer of the IPython project, the creator of PyZMQ, and a contributor to SymPy. He has been a speaker and tutorial presenter at numerous computing conferences including PyCon, SciPy and Strata.


Description

IPython provides tools for interactive and parallel computing that are widely used in scientific computing, but can benefit any Python developer.

We will show how to use IPython in several different ways: as an interactive shell, an embedded shell, a graphical console, a network-aware kernel embedded in GUIs, a web-based notebook that combines code, graphics and rich HTML, and a high-level framework for parallel computing.
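The embedded-shell use case, for example, amounts to a single function call inside your own code. The sketch below is a minimal illustration (assuming only that the IPython package is installed); the script name and function are hypothetical:

    # embed_demo.py -- drop into an IPython shell in the middle of a script.
    from IPython import embed

    def simulate(n):
        data = [x ** 2 for x in range(n)]
        # Opens an interactive IPython session at this point, with `data`
        # and `n` available in the namespace; exit the shell to resume.
        embed()
        return sum(data)

    if __name__ == "__main__":
        print(simulate(10))

Exiting the embedded shell returns control to the script, which makes this pattern handy for exploratory debugging.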


Outline

IPython started in 2001 simply as a better interactive Python shell. Over the last decade it has grown into a powerful set of interlocking tools that maximize developer productivity in Python while working interactively.

Today, IPython consists of a kernel that executes user code and provides many features for introspection and namespace manipulation, together with tools to control this kernel either in-process or out-of-process, thanks to a well-specified communications protocol implemented over ZeroMQ. This architecture allows the core features to be accessed via a variety of clients, each providing unique functionality tuned to a specific use case:

An interactive, terminal-based shell with capabilities beyond the default Python interactive interpreter (this is the classic application opened by the ipython command that most users are familiar with).

A Qt console that provides the look and feel of a terminal, but adds support for inline figures, graphical calltips, a persistent session that can survive crashes of the kernel process, and more.

A web-based notebook that can execute code and also contain rich text, figures, mathematical equations and arbitrary HTML. The notebook presents a document-like view made up of cells in which code is executed; cells can be edited in place, reordered, and interleaved with explanatory text and figures. The result is an interactive experience that combines live code and results with literate documentation and the rich media that modern browsers can display.

A high-performance, low-latency system for parallel computing that supports the control of a cluster of IPython engines communicating over ZeroMQ, with optimizations that minimize unnecessary copying of large objects (especially numpy arrays). These engines can be controlled interactively while developing and doing exploratory work, or can run in batch mode either on a local machine or in a large cluster/supercomputing environment via a batch scheduler.

In this hands-on, in-depth tutorial, we will briefly describe IPython's architecture and will then show how to use the above tools for a highly productive workflow in Python.
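As a small taste of the parallel tooling covered in the tutorial, the sketch below uses the IPython.parallel Client (the 0.13-era API) to farm work out to a set of engines. It assumes a local cluster has already been started, for example with `ipcluster start -n 4`; the function name is just an illustration:

    # parallel_demo.py -- minimal sketch of IPython.parallel usage.
    # Assumes a cluster of engines is already running, e.g.:
    #     ipcluster start -n 4
    from IPython.parallel import Client

    rc = Client()      # connect to the running controller
    dview = rc[:]      # a DirectView on all available engines

    def slow_square(x):
        # Imports must happen inside the function, since it runs on the engines.
        import time
        time.sleep(0.1)
        return x * x

    # Map the function over the inputs across all engines and gather results.
    results = dview.map_sync(slow_square, range(16))
    print(results)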


Required Packages

  • EPD OR
  • Anaconda OR
  • Linux packages for IPython, NumPy, Matplotlib, SymPy

See http://ipython.org/install.html for further installation details.

IPython version 0.13.1 or higher will be required.
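A quick way to check whether your installed version meets this requirement is the small snippet below (a sanity check, not part of the official installation instructions):

    # Check that the installed IPython is 0.13.1 or newer.
    from distutils.version import LooseVersion
    import IPython

    print("IPython version:", IPython.__version__)
    if LooseVersion(IPython.__version__) < LooseVersion("0.13.1"):
        print("Please upgrade: this tutorial needs IPython 0.13.1 or newer.")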


Documentation

A GitHub repo with our tutorial materials:

https://github.com/ipython/ipython-in-depth