Front Neuroinform. 2009 Jan 8;2:8. doi: 10.3389/neuro.11.008.2008. eCollection 2008.

Modular Toolkit for Data Processing (MDP): A Python Data Processing Framework

Tiziano Zito et al. Front Neuroinform.

Abstract

Modular toolkit for Data Processing (MDP) is a data processing framework written in Python. From the user's perspective, MDP is a collection of supervised and unsupervised learning algorithms and other data processing units that can be combined into data processing sequences and into more complex feed-forward network architectures. Computations are performed efficiently in terms of speed and memory requirements. From the scientific developer's perspective, MDP is a modular framework that can easily be expanded. The implementation of new algorithms is easy and intuitive, and newly implemented units are automatically integrated with the rest of the library. MDP has been written in the context of theoretical research in neuroscience, but it has been designed to be helpful in any context where trainable data processing algorithms are used. Its simplicity on the user's side, the variety of readily available algorithms, and the reusability of the implemented units also make it a useful educational tool.
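The data processing sequences mentioned in the abstract correspond to MDP flows, i.e. chains of processing units ("nodes") that are trained and executed as a single object. The following is a minimal sketch of that usage pattern, assuming the standard public API (mdp.Flow, mdp.nodes.PCANode, mdp.nodes.PolynomialExpansionNode, mdp.nodes.SFANode); the particular node choice and the random input data are illustrative only and are not taken from the paper.

```python
import numpy as np
import mdp

# Illustrative input: 1000 observations of a 20-dimensional signal.
x = np.random.random((1000, 20))

# A Flow chains processing units into one trainable sequence:
# PCA for dimensionality reduction, a quadratic expansion,
# then Slow Feature Analysis extracting the three slowest features.
flow = mdp.Flow([mdp.nodes.PCANode(output_dim=5),
                 mdp.nodes.PolynomialExpansionNode(2),
                 mdp.nodes.SFANode(output_dim=3)])

flow.train(x)          # trainable nodes are trained one after the other
y = flow.execute(x)    # the trained sequence is applied to the data
```

Because every node exposes the same train/execute interface, the same units can also be used on their own or embedded in the more complex feed-forward network architectures the abstract refers to.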

Keywords: Modular toolkit for Data Processing; Python; computational neuroscience; machine learning.


Figures

Figure 1. A simple denoising application.
Figure 2. Definition of a new node that removes the mean of the signal (a generic sketch of such a node follows this list).
Figure 3. Example of feed-forward network topology.
Figure 4. Python code to reproduce the results in Wiskott (2003).
Figure 5. Chaotic time series generated by the logistic equation.
Figure 6. The real driving force and the driving force as estimated by SFA.
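Figure 2 refers to extending the library with a new processing unit. As a rough illustration of what such an extension looks like, here is a hedged sketch of a mean-removal node written against MDP's documented subclassing convention (overriding the private _train, _stop_training, _execute, and _inverse methods of mdp.Node); it is a generic example following that convention, not a transcription of the figure.

```python
import numpy as np
import mdp

class MeanFreeNode(mdp.Node):
    """Accumulate the mean during training and subtract it at execution."""

    def __init__(self, input_dim=None, dtype=None):
        super(MeanFreeNode, self).__init__(input_dim=input_dim, dtype=dtype)
        self.avg = None    # running sum during training, later the mean
        self.tlen = 0      # number of training samples seen

    def _train(self, x):
        # x is a 2D array with one observation per row.
        if self.avg is None:
            self.avg = np.zeros(self.input_dim, dtype=self.dtype)
        self.avg += x.sum(axis=0)
        self.tlen += x.shape[0]

    def _stop_training(self):
        self.avg /= self.tlen
        if self.output_dim is None:
            self.output_dim = self.input_dim

    def _execute(self, x):
        return x - self.avg

    def _inverse(self, y):
        return y + self.avg

# The new unit integrates with the rest of the library automatically,
# since dimensionality checks and type casting are inherited from mdp.Node:
node = MeanFreeNode()
node.train(np.random.random((100, 4)))
node.stop_training()
zero_mean = node.execute(np.random.random((10, 4)))
```

Once defined, such a node can be dropped into a Flow alongside the built-in nodes, which is the sense in which the abstract says newly implemented units are automatically integrated with the rest of the library.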

