Welcome to the documentation of hybridLFPy!¶
Module hybridLFPy¶
Python module implementing a hybrid modeling scheme for predicting extracellular potentials (local field potentials, LFPs) from spiking neuron network simulations.
Development¶
The module hybridLFPy was mainly developed in the Computational Neuroscience Group (http://compneuro.umb.no), Department of Mathematical Sciences and Technology (http://www.nmbu.no/imt), at the Norwegian University of Life Sciences (http://www.nmbu.no), Aas, Norway, in collaboration with the Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6), Juelich Research Centre and JARA, Juelich, Germany (http://www.fz-juelich.de/inm/inm-6/EN/).
Manuscript¶
A preprint of our manuscript on the hybrid scheme implemented in hybridLFPy
is available on arXiv.org at http://arxiv.org/abs/1511.01681
Citation: Espen Hagen, David Dahmen, Maria L. Stavrinou, Henrik Linden, Tom Tetzlaff, Sacha Jennifer van Albada, Sonja Gruen, Markus Diesmann, Gaute T. Einevoll. Hybrid scheme for modeling local field potentials from point-neuron networks. arXiv:1511.01681 [q-bio.NC]
Bibtex source:
@ARTICLE{2015arXiv151101681H,
author = {{Hagen}, E. and {Dahmen}, D. and {Stavrinou}, M.~L. and {Lind{\'e}n}, H. and
{Tetzlaff}, T. and {van Albada}, S.~J. and {Gr{\"u}n}, S. and
{Diesmann}, M. and {Einevoll}, G.~T.},
title = "{Hybrid scheme for modeling local field potentials from point-neuron networks}",
journal = {ArXiv e-prints},
archivePrefix = "arXiv",
eprint = {1511.01681},
primaryClass = "q-bio.NC",
keywords = {Quantitative Biology - Neurons and Cognition},
year = 2015,
month = nov,
adsurl = {http://adsabs.harvard.edu/abs/2015arXiv151101681H},
adsnote = {Provided by the SAO/NASA Astrophysics Data System}
}
Tutorial slides¶
Slides on LFPy and hybridLFPy from the OCNS 2015 meeting tutorial T2: Modeling and analysis of extracellular potentials, hosted in Prague, Czech Republic: CNS2015_LFPy_tutorial.pdf
License¶
This software is released under the General Public License (see LICENSE file).
Warranty¶
This software comes without any form of warranty.
Installation¶
First download all the hybridLFPy source files using git (http://git-scm.com). Open a terminal window and type:
cd $HOME/where/to/put/hybridLFPy
git clone https://github.com/INM-6/hybridLFPy.git
To use hybridLFPy from any working folder without installing the files, add this path to $PYTHONPATH. Edit your .bash_profile or similar file, and add:
export PYTHONPATH=$PYTHONPATH:/PATH/TO/THIS/FOLDER
Installing it is also possible, but not recommended as things might change with any pull request from the repository:
(sudo) python setup.py install (--user)
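Either way, a quick sanity check that Python finds the package (a minimal sketch; it only verifies that the module can be imported and shows which copy will be used):
import hybridLFPy
print(hybridLFPy.__file__)  # path of the hybridLFPy package that will be used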
examples folder¶
Some example script(s) on how to use this module
docs folder¶
Source files for autogenerated documentation using Sphinx.
To compile documentation source files in this directory using sphinx, use:
sphinx-build -b html docs documentation
Online documentation¶
The sphinx-generated html documentation can be accessed at http://INM-6.github.io/hybridLFPy
Notes on performance¶
The present version of hybridLFPy facilitates trivial parallelism, as the contribution of each single-cell LFP can be computed independently. However, this does not imply that the present implementation is highly optimized for speed. In particular, initializing the multicompartment neuron populations does not benefit as much from increasing the MPI pool size, as exemplified by a benchmark based on the Brunel-network example scaled up to 50,000 neurons with simplified neuron morphologies.
Module hybridLFPy
hybridLFPy¶
Provides methods for estimating extracellular potentials of simplified spiking neuron network models.
How to use the documentation¶
Documentation is available in two forms:
- Docstrings provided with the code, e.g., as hybridLFPy? within IPython
- Autogenerated sphinx-built output, compiled by issuing
sphinx-build -b html docs documentation
in the root folder of the package sources
Available classes¶
CachedNetwork
- Offline interface to network spike events, used by class Population
CachedNoiseNetwork
- Creation of Poisson spiketrains of a putative network model, interfaces class Population
CachedFixedSpikesNetwork
- Creation of identical spiketrains per population of a putative network model, interface to class Population
GDF
- Class using sqlite to efficiently store and query large amounts of spike output data, used by Cached*Network
PopulationSuper
- Parent class setting up a base population of multicompartment neurons
Population
- Daughter of PopulationSuper, using CachedNetwork spike events as synapse activation times with layer and cell type connection specificity
PostProcess
- Methods for processing output of multiple instances of class Population
Available utilities¶
csd
- Ground truth current-source density estimation from multicompartment models
helpers
- Various methods used throughout simulations
setup_file_dest
- Setup destination folders of simulation output files
test
- Run a series of unit tests
class CachedNetwork
class hybridLFPy.CachedNetwork(simtime=1000.0, dt=0.1, spike_output_path='spike_output_path', label='spikes', ext='gdf', GIDs={'EX': [1, 400], 'IN': [401, 100]}, autocollect=True, cmap='Set1')
Bases: object
Offline processing and storing of network spike events, used by other class objects in the package hybridLFPy.
Parameters: simtime : float
Simulation duration.
dt : float,
Simulation timestep size.
spike_output_path : str
Path to gdf-files with spikes.
label : str
Prefix of spiking gdf-files.
ext : str
File extension of gdf-files.
GIDs : dict
Keys are population names; each value is a list with the first GID in the population and the population size.
autocollect : bool
If True, class init will process gdf files.
cmap : str
Name of colormap, must be in dir(plt.cm).
Returns: hybridLFPy.cachednetworks.CachedNetwork object
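A minimal usage sketch, assuming NEST-style gdf spike files with prefix 'spikes' and extension 'gdf' already exist in the folder spike_output_path (file locations and GID ranges below are placeholders):
from hybridLFPy import CachedNetwork

networkSim = CachedNetwork(
    simtime=1000.,                 # network simulation duration (ms)
    dt=0.1,                        # network simulation time step (ms)
    spike_output_path='spike_output_path',
    label='spikes',                # prefix of the gdf files
    ext='gdf',                     # file extension of the gdf files
    GIDs={'EX': [1, 400], 'IN': [401, 100]},  # {population: [first GID, population size]}
    autocollect=True,              # parse the gdf files into an sqlite database on init
)
# spike times (x) and corresponding GIDs (y) per population on a time interval
x, y = networkSim.get_xy([0., 1000.])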
Methods
collect_gdf()
Collect the gdf-files from network sim in folder spike_output_path into sqlite database, using the GDF-class.
get_xy(xlim[, fraction])
Get pairs of node units and spike trains on specific time interval.
plot_f_rate(ax, X, i, xlim, x, y[, binsize, ...])
Plot network firing rate plot in subplot object.
plot_raster(ax, xlim, x, y[, pop_names, ...])
Plot network raster plot in subplot object.
raster_plots([xlim, markersize, alpha, marker])
Pretty plot of the spiking output of each population as raster and rate.
-
collect_gdf
()[source]¶ Collect the gdf-files from network sim in folder spike_output_path into sqlite database, using the GDF-class.
Parameters: None Returns: None
-
get_xy
(xlim, fraction=1.0)[source]¶ Get pairs of node units and spike trains on specific time interval.
Parameters: xlim : list of floats
Spike time interval, e.g., [0., 1000.].
fraction : float in [0, 1.]
If less than one, sample a fraction of nodes in random order.
Returns: x : dict
Key-value entries are population name and neuron spike times.
y : dict
Key-value entries are population name and neuron GID number.
-
plot_f_rate
(ax, X, i, xlim, x, y, binsize=1, yscale='linear', plottype='fill_between', show_label=False, rasterized=False)[source]¶ Plot network firing rate plot in subplot object.
Parameters: ax : matplotlib.axes.AxesSubplot object.
X : str
Population name.
i : int
Population index in class attribute X.
xlim : list of floats
Spike time interval, e.g., [0., 1000.].
x : dict
Key-value entries are population name and neuron spike times.
y : dict
Key-value entries are population name and neuron gid number.
yscale : str
Linear, log, or symlog y-axis in rate plot.
plottype : str
plot type string in [‘fill_between’, ‘bar’]
show_label : bool
whether or not to show labels
Returns: None
-
plot_raster
(ax, xlim, x, y, pop_names=False, markersize=20.0, alpha=1.0, legend=True, marker='o', rasterized=True)[source]¶ Plot network raster plot in subplot object.
Parameters: ax : matplotlib.axes.AxesSubplot object
plot axes
xlim : list
List of floats. Spike time interval, e.g., [0., 1000.].
x : dict
Key-value entries are population name and neuron spike times.
y : dict
Key-value entries are population name and neuron gid number.
pop_names: bool
If True, show population names on yaxis instead of gid number.
markersize : float
raster plot marker size
alpha : float in [0, 1]
transparency of marker
legend : bool
Switch on axes legends.
marker : str
marker symbol for matplotlib.pyplot.plot
rasterized : bool
if True, the scatter plot will be treated as a bitmap embedded in pdf file output
Returns: None
-
raster_plots
(xlim=[0, 1000], markersize=1, alpha=1.0, marker='o')[source]¶ Pretty plot of the spiking output of each population as raster and rate.
Parameters: xlim : list
List of floats. Spike time interval, e.g., [0., 1000.].
markersize : float
marker size for plot, see matplotlib.pyplot.plot
alpha : float
transparency for markers, see matplotlib.pyplot.plot
marker :
A valid marker style
Returns: fig : matplotlib.figure.Figure object
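For a quick overview figure of the collected spike output (sketch; assumes networkSim is a CachedNetwork instance set up as above):
import matplotlib.pyplot as plt
fig = networkSim.raster_plots(xlim=[0, 1000], markersize=1, alpha=1., marker='o')
plt.show()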
-
class CachedNoiseNetwork
class hybridLFPy.CachedNoiseNetwork(frate=[(200.0, 15.0, 210.0), 0.992, 3.027, 4.339, 5.962, 7.628, 8.669, 1.118, 7.859], autocollect=False, **kwargs)
Bases: hybridLFPy.cachednetworks.CachedNetwork
Subclass of CachedNetwork.
Use NEST to generate N_X Poisson generators, each firing at rate frate, record their spike output, and create a database with the spikes.
Parameters: frate : list
Rate of each layer, may be tuple (onset, rate, offset)
autocollect : bool
whether or not to automatically gather gdf file output
**kwargs : see parent class hybridLFPy.cachednetworks.CachedNetwork
Returns: hybridLFPy.cachednetworks.CachedNoiseNetwork object
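A minimal sketch, assuming NEST is installed and that frate supplies one entry per population defined in GIDs (a plain value is a stationary rate, a tuple is (onset, rate, offset)); these assumptions are illustrative, not part of the documented interface:
from hybridLFPy import CachedNoiseNetwork
noiseNetwork = CachedNoiseNetwork(
    frate=[(200., 15., 210.), 5.],   # hypothetical rates, one entry per population below
    autocollect=True,
    # remaining keyword arguments are passed on to the parent CachedNetwork:
    spike_output_path='spike_output_path',
    GIDs={'EX': [1, 400], 'IN': [401, 100]},
)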
Methods
collect_gdf()
Collect the gdf-files from network sim in folder spike_output_path into sqlite database, using the GDF-class.
get_xy(xlim[, fraction])
Get pairs of node units and spike trains on specific time interval.
plot_f_rate(ax, X, i, xlim, x, y[, binsize, ...])
Plot network firing rate plot in subplot object.
plot_raster(ax, xlim, x, y[, pop_names, ...])
Plot network raster plot in subplot object.
raster_plots([xlim, markersize, alpha, marker])
Pretty plot of the spiking output of each population as raster and rate.
-
spikes = None
Create independent Poisson spike trains with the same rate, but each layer population should really have different rates.
-
class CachedFixedSpikesNetwork
class hybridLFPy.CachedFixedSpikesNetwork(activationtimes=[200, 300, 400, 500, 600, 700, 800, 900, 1000], autocollect=False, **kwargs)
Bases: hybridLFPy.cachednetworks.CachedNetwork
Subclass of CachedNetwork.
Fake NEST output, where each cell in a subpopulation spikes simultaneously, and each subpopulation is activated at the times given in the keyword argument activationtimes.
Parameters: activationtimes : list of floats
Each entry sets the spike time of all cells in the corresponding population.
autocollect : bool
whether or not to automatically gather gdf file output
**kwargs : see parent class hybridLFPy.cachednetworks.CachedNetwork
Returns: hybridLFPy.cachednetworks.CachedFixedSpikesNetwork object
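A minimal sketch, assuming one activation time per population defined in GIDs; here all 'EX' cells spike at t=200 ms and all 'IN' cells at t=300 ms:
from hybridLFPy import CachedFixedSpikesNetwork
fixedNetwork = CachedFixedSpikesNetwork(
    activationtimes=[200, 300],      # one activation time per population below
    autocollect=True,
    # remaining keyword arguments are passed on to the parent CachedNetwork:
    spike_output_path='spike_output_path',
    GIDs={'EX': [1, 400], 'IN': [401, 100]},
)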
Methods
collect_gdf()
Collect the gdf-files from network sim in folder spike_output_path into sqlite database, using the GDF-class.
get_xy(xlim[, fraction])
Get pairs of node units and spike trains on specific time interval.
plot_f_rate(ax, X, i, xlim, x, y[, binsize, ...])
Plot network firing rate plot in subplot object.
plot_raster(ax, xlim, x, y[, pop_names, ...])
Plot network raster plot in subplot object.
raster_plots([xlim, markersize, alpha, marker])
Pretty plot of the spiking output of each population as raster and rate.
class PopulationSuper
class hybridLFPy.PopulationSuper(cellParams={'verbose': False, 'cm': 1.0, 'nsegs_method': 'lambda_f', 'tstartms': 0, 'timeres_python': 0.1, 'Ra': 150, 'morphology': 'morphologies/ex.hoc', 'timeres_NEURON': 0.1, 'v_init': 0.0, 'lambda_f': 100, 'e_pas': 0.0, 'rm': 20000.0, 'tstopms': 1000.0}, rand_rot_axis=[], simulationParams={}, populationParams={'z_min': -450, 'z_max': -350, 'min_cell_interdist': 1.0, 'radius': 100, 'number': 400}, y='EX', layerBoundaries=[[0.0, -300], [-300, -500]], electrodeParams={'r_z': [[-1e+199, -600, -550, 1e+99], [0, 0, 10, 10]], 'r': 5, 'z': [-0.0, -100.0, -200.0, -300.0, -400.0, -500.0], 'seedvalue': None, 'y': [0, 0, 0, 0, 0, 0], 'x': [0, 0, 0, 0, 0, 0], 'n': 20, 'sigma': 0.3, 'method': 'som_as_point', 'N': [[1, 0, 0], [1, 0, 0], [1, 0, 0], [1, 0, 0], [1, 0, 0], [1, 0, 0]]}, savelist=['somapos', 'x', 'y', 'z', 'LFP', 'CSD'], savefolder='simulation_output_example_brunel', calculateCSD=True, dt_output=1.0, recordSingleContribFrac=0, POPULATIONSEED=123456, verbose=False, output_file='{}_population_{}')
Bases: object
Main population class object; lets one set up simulations, execute them, and compile the results. This class is suitable for subclassing for custom cell simulation procedures, inheriting methods such as the gathering of data written to disk.
Note that PopulationSuper.cellsim does not add any stimuli; its main purpose is to gather common methods for inherited Population objects.
Parameters: cellParams : dict
Parameters for class LFPy.Cell
rand_rot_axis : list
Axes around which to randomly rotate morphologies.
simulationParams : dict
Additional args for LFPy.Cell.simulate().
populationParams : dict
Constraints for population and cell number.
y : str
Population identifier string.
layerBoundaries : list of lists
Each element is a list setting upper and lower layer boundary (floats)
electrodeParams : dict
parameters for class LFPy.RecExtElectrode
savelist : list
LFPy.Cell arguments to save for each single-cell simulation.
savefolder : str
path to where simulation results are stored.
calculateCSD : bool
Extract the laminar CSD from transmembrane currents.
dt_output : float
Time resolution of output, e.g., LFP, CSD etc.
recordSingleContribFrac : float
Fraction in [0, 1] of individual neurons in the population whose output will be stored.
POPULATIONSEED : int/float
Random seed for population, for positions etc.
verbose : bool
Verbosity flag.
output_file : str
formattable string for population output, e.g., ‘{}_population_{}’
Returns: hybridLFPy.population.PopulationSuper object
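A minimal usage sketch relying on the default parameters shown above (so the morphology file morphologies/ex.hoc and the output folders are assumed to exist); such a script would typically be launched under MPI, e.g., with mpirun:
from hybridLFPy import PopulationSuper
pop = PopulationSuper(
    y='EX',                                     # population identifier
    savefolder='simulation_output_example_brunel',
    dt_output=1.,                               # output resolution of LFP/CSD (ms)
    POPULATIONSEED=123456,                      # random seed for cell positions etc.
)
pop.run()            # distribute single-cell simulations across MPI ranks
pop.collect_data()   # gather LFPs, CSDs and soma traces, write them to savefolder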
See also
Population
,LFPy.Cell
,LFPy.RecExtElectrode
Methods
calc_min_cell_interdist(x, y, z)
Calculate cell interdistance from input coordinates.
calc_signal_sum([measure])
Superimpose each cell's contribution to the compound population signal.
cellsim(cellindex[, return_just_cell])
Single-cell LFPy.Cell simulation without any stimulus, mostly for reference.
collectSingleContribs([measure])
Collect single cell data and save them to HDF5 file.
collect_data()
Collect LFPs, CSDs and soma traces from each simulated population, and save to file.
draw_rand_pos(radius, z_min, z_max[, min_r, ...])
Draw some random location within radius, z_min, z_max, and constrained by min_r and the minimum cell interdistance.
run()
Distribute individual cell simulations across ranks.
set_pop_soma_pos()
Set pop_soma_pos using draw_rand_pos().
set_rotations()
Append random z-axis rotations for each cell in population.
-
calc_min_cell_interdist
(x, y, z)[source]¶ Calculate cell interdistance from input coordinates.
Parameters: x, y, z : numpy.ndarray
xyz-coordinates of each cell-body.
Returns: min_cell_interdist : numpy.ndarray
For each cell-body center, the distance to nearest neighboring cell
-
calc_signal_sum
(measure='LFP')[source]¶ Superimpose each cell’s contribution to the compound population signal, i.e., the population CSD or LFP
Parameters: measure : str
{‘LFP’, ‘CSD’}: Either ‘LFP’ or ‘CSD’.
Returns: numpy.ndarray
The populations-specific compound signal.
-
cellsim
(cellindex, return_just_cell=False)[source]¶ Single-cell LFPy.Cell simulation without any stimulus, mostly for reference, as no stimulus is added
Parameters: cellindex : int
cell index between 0 and POPULATION_SIZE-1.
return_just_cell : bool
If True, return only the LFPy.Cell object; if False, run the full simulation and return None.
Returns: None
if return_just_cell is False
cell : LFPy.Cell instance
if return_just_cell is True
See also
LFPy.Cell
,LFPy.Synapse
,LFPy.RecExtElectrode
-
collectSingleContribs
(measure='LFP')[source]¶ Collect single cell data and save them to HDF5 file. The function will also return signals generated by all cells
Parameters: measure : str
{‘LFP’, ‘CSD’}: Either ‘LFP’ or ‘CSD’.
Returns: numpy.ndarray
Output of all neurons in the population; axis 0 corresponds to the neuron index.
-
collect_data
()[source]¶ Collect LFPs, CSDs and soma traces from each simulated population, and save to file.
Parameters: None Returns: None
-
draw_rand_pos
(radius, z_min, z_max, min_r=array([0]), min_cell_interdist=10.0, **args)[source]¶ Draw some random location within radius, z_min, z_max, and constrained by min_r and the minimum cell interdistance. Returned argument is a list of dicts [{‘xpos’, ‘ypos’, ‘zpos’},].
Parameters: radius : float
Radius of population.
z_min : float
Lower z-boundary of population.
z_max : float
Upper z-boundary of population.
min_r : numpy.ndarray
Minimum distance to center axis as function of z.
min_cell_interdist : float
Minimum cell to cell interdistance.
**args : keyword arguments
Additional inputs that are ignored.
Returns: soma_pos : list
List of dicts of length population size, where each dict has keys xpos, ypos, zpos specifying the xyz-coordinates of the cell at list entry i.
-
run
()[source]¶ Distribute individual cell simulations across ranks.
This method takes no keyword arguments.
Parameters: None Returns: None
-
class Population
class hybridLFPy.Population(X=['EX', 'IN'], networkSim='hybridLFPy.cachednetworks.CachedNetwork', k_yXL=[[20, 0], [20, 10]], synParams={'EX': {'syntype': 'AlphaISyn', 'section': ['apic', 'dend']}, 'IN': {'syntype': 'AlphaISyn', 'section': ['dend']}}, synDelayLoc=[1.5, 1.5], synDelayScale=[None, None], J_yX=[0.20680155243678455, -1.2408093146207075], tau_yX=[0.5, 0.5], **kwargs)
Bases: hybridLFPy.population.PopulationSuper
Class hybridLFPy.Population, inherited from class PopulationSuper.
This class relies on spiking times recorded in a network simulation, layer-resolved indegrees, synapse parameters, and delay parameters, all per presynaptic population.
Parameters: X : list of str
Each element denote name of presynaptic populations.
networkSim : hybridLFPy.cachednetworks.CachedNetwork object
Container of network spike events resolved per population
k_yXL : numpy.ndarray
Num layers x num presynapse populations array specifying the number of incoming connections per layer and per population type.
synParams : dict of dicts
Synapse parameters (cf. LFPy.Synapse class). Each toplevel key denote each presynaptic population, bottom-level dicts are parameters passed to LFPy.Synapse.
synDelayLoc : list
Average synapse delay for each presynapse connection.
synDelayScale : list
Synapse delay std for each presynapse connection.
J_yX : list of floats
Synapse weights for connections of each presynaptic population, see class LFPy.Synapse
Returns: hybridLFPy.population.Population object
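A minimal usage sketch, assuming networkSim is a CachedNetwork instance (see above) and reusing the default synapse parameters from the signature; remaining keyword arguments (here y and savefolder) are passed on to PopulationSuper:
from hybridLFPy import Population
pop = Population(
    X=['EX', 'IN'],                   # presynaptic population names
    networkSim=networkSim,            # container of network spike events
    k_yXL=[[20, 0], [20, 10]],        # indegrees per layer (rows) and presynaptic population (columns)
    synParams={'EX': {'syntype': 'AlphaISyn', 'section': ['apic', 'dend']},
               'IN': {'syntype': 'AlphaISyn', 'section': ['dend']}},
    synDelayLoc=[1.5, 1.5],           # mean synaptic delay per presynaptic population (ms)
    synDelayScale=[None, None],       # delay standard deviations; None gives fixed delays
    J_yX=[0.21, -1.24],               # synaptic weights per presynaptic population
    tau_yX=[0.5, 0.5],                # as in the constructor default (presumably synapse time constants)
    y='EX',
    savefolder='simulation_output_example_brunel',
)
pop.run()
pop.collect_data()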
See also
PopulationSuper
,CachedNetwork
,CachedFixedSpikesNetwork
,CachedNoiseNetwork
,LFPy.Cell
,LFPy.RecExtElectrode
Methods
calc_min_cell_interdist(x, y, z)
Calculate cell interdistance from input coordinates.
calc_signal_sum([measure])
Superimpose each cell's contribution to the compound population signal.
cellsim(cellindex[, return_just_cell])
Do the actual simulations of LFP, using synaptic spike times from network simulation.
collectSingleContribs([measure])
Collect single cell data and save them to HDF5 file.
collect_data()
Collect LFPs, CSDs and soma traces from each simulated population, and save to file.
draw_rand_pos(radius, z_min, z_max[, min_r, ...])
Draw some random location within radius, z_min, z_max, and constrained by min_r and the minimum cell interdistance.
fetchSpCells(nodes, numSyn)
For N (nodes count) nestSim-cells draw POPULATION_SIZE x NTIMES random cell indexes in the population in nodes and broadcast these as SpCell.
fetchSynIdxCell(cell, nidx, synParams)
Find possible synaptic placements for each cell.
get_all_SpCells()
For each postsynaptic cell existing on this RANK, load or compute the presynaptic cell indices for each synaptic connection.
get_all_synDelays()
Create and load arrays of connection delays per connection on this rank.
get_all_synIdx()
Auxiliary function to set up class attributes containing synapse locations given as LFPy.Cell compartment indices.
get_synidx(cellindex)
Local function, draw and return synapse locations corresponding to a single cell, using a random seed set as POPULATIONSEED + cellindex.
insert_all_synapses(cellindex, cell)
Insert all synaptic events from all presynaptic layers on cell object with index cellindex.
insert_synapses(cell, cellindex, synParams)
Insert synapses with parameters synParams on cell, with segment indexes given by idx.
run()
Distribute individual cell simulations across ranks.
set_pop_soma_pos()
Set pop_soma_pos using draw_rand_pos().
set_rotations()
Append random z-axis rotations for each cell in population.
-
cellsim
(cellindex, return_just_cell=False)[source]¶ Do the actual simulations of LFP, using synaptic spike times from network simulation.
Parameters: cellindex : int
cell index between 0 and population size-1.
return_just_cell : bool
If True, return only the LFPy.Cell object; if False, run the full simulation and return None.
Returns: None or LFPy.Cell object
See also
hybridLFPy.csd
,LFPy.Cell
,LFPy.Synapse
,LFPy.RecExtElectrode
-
fetchSpCells
(nodes, numSyn)[source]¶ For N (nodes count) nestSim-cells draw POPULATION_SIZE x NTIMES random cell indexes in the population in nodes and broadcast these as SpCell.
The returned argument is a list of np.arrays with length numSyn.size; assumes numSyn is a list.
Parameters: nodes : numpy.ndarray, dtype=int
Node # of valid presynaptic neurons.
numSyn : numpy.ndarray, dtype=int
# of synapses per connection.
Returns: SpCells : list
presynaptic network-neuron indices
See also
Population.get_all_SpCells
-
fetchSynIdxCell
(cell, nidx, synParams)[source]¶ Find possible synaptic placements for each cell. As synapses are placed within layers with bounds determined by self.layerBoundaries, it will check this matrix accordingly, and use the probabilities from self.connProbLayer to distribute them.
For each layer, the synapses are placed with probability normalized by membrane area of each compartment
Parameters: cell : LFPy.Cell instance
nidx : numpy.ndarray
Numbers of synapses per presynaptic population X.
synParams : which LFPy.Synapse parameters to use.
Returns: syn_idx : list
List of arrays of synapse placements per connection.
See also
Population.get_all_synIdx
,Population.get_synIdx
,LFPy.Synapse
-
get_all_SpCells
()[source]¶ For each postsynaptic cell existing on this RANK, load or compute the presynaptic cell indices for each synaptic connection.
This function takes no kwargs.
Parameters: None
Returns: SpCells : dict
output[cellindex][populationname][layerindex], np.array of presynaptic cell indices.
-
get_all_synDelays
()[source]¶ Create and load arrays of connection delays per connection on this rank
Get random normally distributed synaptic delays, returns dict of nested list of same shape as SpCells.
Delays are rounded to dt.
This function takes no kwargs.
Parameters: None
Returns: dict
output[cellindex][populationname][layerindex], np.array of delays per connection.
See also
numpy.random.normal
-
get_all_synIdx
()[source]¶ Auxiliary function to set up class attributes containing synapse locations given as LFPy.Cell compartment indices
This function takes no inputs.
Parameters: None
Returns: synIdx : dict
output[cellindex][populationindex][layerindex] numpy.ndarray of compartment indices.
-
get_synidx
(cellindex)[source]¶ Local function, draw and return synapse locations corresponding to a single cell, using a random seed set as POPULATIONSEED + cellindex.
Parameters: cellindex : int
Index of cell object.
Returns: synidx : dict
LFPy.Cell compartment indices
-
insert_all_synapses
(cellindex, cell)[source]¶ Insert all synaptic events from all presynaptic layers on cell object with index cellindex.
Parameters: cellindex : int
cell index in the population.
cell : LFPy.Cell instance
Postsynaptic target cell.
Returns: None
See also
Population.insert_synapse
-
insert_synapses
(cell, cellindex, synParams, idx=array([], dtype=float64), X='EX', SpCell=array([], dtype=float64), synDelays=None)[source]¶ Insert synapses with parameters synParams on cell, with segment indexes given by idx. SpCell and SpTimes are picked from the Brunel network simulation.
Parameters: cell : LFPy.Cell instance
Postsynaptic target cell.
cellindex : int
Index of cell in population.
synParams : dict
Parameters passed to LFPy.Synapse.
idx : numpy.ndarray
Postsynaptic compartment indices.
X : str
presynaptic population name
SpCell : numpy.ndarray
Presynaptic spiking cells.
synDelays : numpy.ndarray
Per connection specific delays.
Returns: None
-
class PostProcess
class hybridLFPy.PostProcess(y=['EX', 'IN'], dt_output=1.0, mapping_Yy=[('EX', 'EX'), ('IN', 'IN')], savelist=['somav', 'somapos', 'x', 'y', 'z', 'LFP', 'CSD'], savefolder='simulation_output_example_brunel', cells_subfolder='cells', populations_subfolder='populations', figures_subfolder='figures', output_file='{}_population_{}', compound_file='{}sum.h5')
Bases: object
class PostProcess: Methods to deal with the contributions of every postsynaptic sub-population.
Parameters: y : list
Postsynaptic cell-type or population-names.
dt_output : float
Time resolution of output data.
savefolder : str
Path to main output folder.
mapping_Yy : list
List of tuples, each tuple pairing population with cell type, e.g., [(‘L4E’, ‘p4’), (‘L4E’, ‘ss4’)].
cells_subfolder : str
Folder under savefolder containing cell output.
populations_subfolder : str
Folder under savefolder containing population specific output.
figures_subfolder : str
Folder under savefolder containing figs.
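A minimal usage sketch, run after all Population objects have written their output to the savefolder (folder layout as in the defaults above); typically executed on a single MPI rank (an assumption, not part of the documented interface):
from hybridLFPy import PostProcess
postproc = PostProcess(
    y=['EX', 'IN'],
    dt_output=1.,
    mapping_Yy=[('EX', 'EX'), ('IN', 'IN')],
    savefolder='simulation_output_example_brunel',
)
postproc.run()                 # compute compound LFP/CSD signals from per-cell-type output files
postproc.create_tar_archive()  # optionally bundle the main simulation output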
Methods
calc_csd()
Sum all the CSD contributions from every layer.
calc_csd_layer()
Calculate the CSD from concatenated subpopulations residing in a certain layer, e.g., all L4E populations are summed, according to the mapping_Yy attribute of the hybridLFPy.Population objects.
calc_lfp()
Sum all the LFP contributions from every cell type.
calc_lfp_layer()
Calculate the LFP from concatenated subpopulations residing in a certain layer, e.g., all L4E populations are summed, according to the mapping_Yy attribute of the hybridLFPy.Population objects.
create_tar_archive()
Create a tar archive of the main simulation outputs.
run()
Perform the postprocessing steps, computing compound signals from cell-specific output files.
-
calc_csd_layer
()[source]¶ Calculate the CSD from concatenated subpopulations residing in a certain layer, e.g., all L4E populations are summed, according to the mapping_Yy attribute of the hybridLFPy.Population objects.
-
class GDF
class hybridLFPy.GDF(dbname, bsize=1000000, new_db=True, debug=False)
Bases: object
- Read from gdf files.
- Create sqlite db of (neuron, spike time).
- Query spike times for neurons.
Parameters: dbname : str
Filename of sqlite database, see sqlite3.connect
bsize : int
Number of spike times to insert at a time; determines memory usage.
new_db : bool
If True, create a new database with name dbname, overwriting any existing one.
Returns: hybridLFPy.gdf.GDF object
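A minimal sketch; the database file name and the gdf file glob below are placeholders:
from hybridLFPy import GDF
db = GDF('spikes.db', new_db=True)
db.create(re='brunel-py-ex-*.gdf', index=True)   # populate the database from matching gdf files
spikes = db.select_neurons_interval([1, 2, 3], T=[0, 1000])   # spike trains of neurons 1-3
print(db.num_spikes())                           # total number of spikes in the database
db.close()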
See also
sqlite3
,sqlite3.connect
,sqlite3.connect.cursor
Methods
close()
Close sqlite3.connect.cursor and sqlite3.connect objects.
create([re, index])
Create db from a glob of gdf files.
create_from_list([re, index])
Create db from list of arrays.
interval([T])
Get all spikes in a time interval T.
neurons()
Return list of neuron indices.
num_spikes()
Return total number of spikes.
plotstuff([T])
Create a scatter plot of the contents of the database, with entries on the interval T.
select(neurons)
Select spike trains.
select_neurons_interval(neurons[, T])
Get all spikes from neurons in a time interval T.
-
close
()[source]¶ Close sqlite3.connect.cursor and sqlite3.connect objects
Parameters: None Returns: None See also
sqlite3.connect.cursor
,sqlite3.connect
-
create
(re='brunel-py-ex-*.gdf', index=True)[source]¶ Create db from a glob of gdf files.
Parameters: re : str
File glob to load.
index : bool
Create index on neurons for speed.
Returns: None
See also
sqlite3.connect.cursor
,sqlite3.connect
-
create_from_list
(re=[], index=True)[source]¶ Create db from list of arrays.
Parameters: re : list
Index of element is cell index, and element i an array of spike times in ms.
index : bool
Create index on neurons for speed.
Returns: None
See also
sqlite3.connect.cursor
,sqlite3.connect
-
interval
(T=[0, 1000])[source]¶ Get all spikes in a time interval T.
Parameters: T : list
Time interval.
Returns: s : list
Nested list with spike times.
See also
sqlite3.connect.cursor
-
neurons
()[source]¶ Return list of neuron indices.
Parameters: None
Returns: list
list of neuron indices
See also
sqlite3.connect.cursor
-
plotstuff
(T=[0, 1000])[source]¶ Create a scatter plot of the contents of the database, with entries on the interval T.
Parameters: T : list
Time interval.
Returns: None
submodule helpers
Documentation:
This is a script containing general helper functions which can be applied to specialized cases.
-
hybridLFPy.helpers.
autocorrfunc
(freq, power)[source]¶ Calculate autocorrelation function(s) for given power spectrum/spectra.
Parameters: freq : numpy.ndarray
1 dimensional array of frequencies.
power : numpy.ndarray
2 dimensional power spectra, 1st axis units, 2nd axis frequencies.
Returns: time : tuple
1 dim numpy.ndarray of times.
autof : tuple
2 dim numpy.ndarray; autocorrelation functions, 1st axis units, 2nd axis times.
-
hybridLFPy.helpers.
calculate_fft
(data, tbin)[source]¶ Function to calculate the Fourier transform of data.
Parameters: data : numpy.ndarray
1D or 2D array containing time series.
tbin : float
Bin size of time series (in ms).
Returns: freqs : numpy.ndarray
Frequency axis of signal in Fourier space.
fft : numpy.ndarray
Signal in Fourier space.
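For example (a sketch using random numbers in place of actual time series):
import numpy as np
import hybridLFPy.helpers as helpers
data = np.random.randn(2, 1000)                     # 2 units, 1000 time bins
freqs, fft = helpers.calculate_fft(data, tbin=1.)   # tbin is the bin size in ms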
-
hybridLFPy.helpers.
centralize
(data, time=False, units=False)[source]¶ Function to subtract the mean across time and/or across units from data
Parameters: data : numpy.ndarray
1D or 2D array containing time series, 1st index: unit, 2nd index: time
time : bool
True: subtract mean across time.
units : bool
True: subtract mean across units.
Returns: numpy.ndarray
1D or 0D array of centralized signal.
-
hybridLFPy.helpers.
coherence
(freq, power, cross)[source]¶ Calculate frequency resolved coherence for given power- and crossspectra.
Parameters: freq : numpy.ndarray
Frequencies, 1 dim array.
power : numpy.ndarray
Power spectra, 1st axis units, 2nd axis frequencies.
cross : numpy.ndarray,
Cross spectra, 1st axis units, 2nd axis units, 3rd axis frequencies.
Returns: freq: tuple
1 dim numpy.ndarray of frequencies.
coh: tuple
ndim 3 numpy.ndarray of coherences, 1st axis units, 2nd axis units, 3rd axis frequencies.
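The power and cross spectra expected here can be computed with powerspec and crossspec documented below (sketch with random data):
import numpy as np
import hybridLFPy.helpers as helpers
data = np.random.randn(4, 1000)                    # 4 units, 1000 time bins
freq, power = helpers.powerspec(data, tbin=1.)     # per-unit power spectra
freq, cross = helpers.crossspec(data, tbin=1.)     # pairwise cross spectra
freq, coh = helpers.coherence(freq, power, cross)  # frequency-resolved coherence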
-
hybridLFPy.helpers.
compound_crossspec
(a_data, tbin, Df=None, pointProcess=False)[source]¶ Calculate cross spectra of compound signals. a_data is a list of datasets (a_data = [data1,data2,...]). For each dataset in a_data, the compound signal is calculated and the cross spectra between these compound signals are computed.
If pointProcess=True, power spectra are normalized by the length T of the time series.
Parameters: a_data : list of numpy.ndarrays
Array: 1st axis unit, 2nd axis time.
tbin : float
Binsize in ms.
Df : float/None,
Window width of sliding rectangular filter (smoothing), None -> no smoothing.
pointProcess : bool
If set to True, crossspectrum is normalized to signal length T
Returns: freq : tuple
numpy.ndarray of frequencies.
CRO : tuple
3 dim numpy.ndarray; 1st axis first compound signal, 2nd axis second compound signal, 3rd axis frequency.
Examples
>>> compound_crossspec([np.array([analog_sig1, analog_sig2]), np.array([analog_sig3, analog_sig4])], tbin, Df=Df)
Out[1]: (freq, CRO)
>>> CRO.shape
Out[2]: (2, 2, len(analog_sig1))
-
hybridLFPy.helpers.
compound_mean
(data)[source]¶ Compute the mean of the compound/sum signal. Data is first summed across units and averaged across time.
Parameters: data : numpy.ndarray
1st axis unit, 2nd axis time
Returns: float
time-averaged compound/sum signal
Examples
>>> compound_mean(np.array([[1, 2, 3], [4, 5, 6]]))
7.0
-
hybridLFPy.helpers.
compound_powerspec
(data, tbin, Df=None, pointProcess=False)[source]¶ Calculate the power spectrum of the compound/sum signal. data is first summed across units, then the power spectrum is calculated.
If pointProcess=True, power spectra are normalized by the length T of the time series.
Parameters: data : numpy.ndarray,
1st axis unit, 2nd axis time
tbin : float,
binsize in ms
Df : float/None,
window width of sliding rectangular filter (smoothing), None -> no smoothing
pointProcess : bool,
if set to True, powerspectrum is normalized to signal length T
Returns: freq : tuple
numpy.ndarray of frequencies
POW : tuple
1 dim numpy.ndarray, frequency series
Examples
>>> compound_powerspec(np.array([analog_sig1, analog_sig2]), tbin, Df=Df)
Out[1]: (freq, POW)
>>> POW.shape
Out[2]: (len(analog_sig1),)
-
hybridLFPy.helpers.
compound_variance
(data)[source]¶ Compute the variance of the compound/sum signal. Data is first summed across units, then the variance across time is calculated.
Parameters: data : numpy.ndarray
1st axis unit, 2nd axis time
Returns: float
variance across time of compound/sum signal
Examples
>>> compound_variance(np.array([[1, 2, 3], [4, 5, 6]]))
2.6666666666666665
-
hybridLFPy.helpers.
corrcoef
(time, crossf, integration_window=0.0)[source]¶ Calculate the correlation coefficient for given auto- and crosscorrelation functions. Standard settings yield the zero lag correlation coefficient. Setting integration_window > 0 yields the correlation coefficient of integrated auto- and crosscorrelation functions. The correlation coefficient between a zero signal with any other signal is defined as 0.
Parameters: time : numpy.ndarray
1 dim array of times corresponding to signal.
crossf : numpy.ndarray
Crosscorrelation functions, 1st axis first unit, 2nd axis second unit, 3rd axis times.
integration_window: float
Size of the integration window.
Returns: cc : numpy.ndarray
2 dim array of correlation coefficient between two units.
-
hybridLFPy.helpers.
crosscorrfunc
(freq, cross)[source]¶ Calculate crosscorrelation function(s) for given cross spectra.
Parameters: freq : numpy.ndarray
1 dimensional array of frequencies.
cross : numpy.ndarray
2 dimensional array of cross spectra, 1st axis units, 2nd axis units, 3rd axis frequencies.
Returns: time : tuple
1 dim numpy.ndarray of times.
crossf : tuple
3 dim numpy.ndarray, crosscorrelation functions, 1st axis first unit, 2nd axis second unit, 3rd axis times.
-
hybridLFPy.helpers.
crossspec
(data, tbin, Df=None, units=False, pointProcess=False)[source]¶ Calculate (smoothed) cross spectra of data. If `units`=True, cross spectra are averaged across units. Note that averaging is done on cross spectra rather than data.
Cross spectra are normalized by the length T of the time series -> no scaling with T.
If pointProcess=True, power spectra are normalized by the length T of the time series.
Parameters: data : numpy.ndarray,
1st axis unit, 2nd axis time
tbin : float,
binsize in ms
Df : float/None,
window width of sliding rectangular filter (smoothing), None -> no smoothing
units : bool,
average cross spectrum
pointProcess : bool,
if set to True, cross spectrum is normalized to signal length T
Returns: freq : tuple
numpy.ndarray of frequencies
CRO : tuple
Examples
>>> crossspec(np.array([analog_sig1, analog_sig2]), tbin, Df=Df)
Out[1]: (freq, CRO)
>>> CRO.shape
Out[2]: (2, 2, len(analog_sig1))
>>> crossspec(np.array([analog_sig1, analog_sig2]), tbin, Df=Df, units=True)
Out[1]: (freq, CRO)
>>> CRO.shape
Out[2]: (len(analog_sig1),)
-
hybridLFPy.helpers.
cv
(data, units=False)[source]¶ Calculate coefficient of variation (cv) of data. Mean and standard deviation are computed across time.
Parameters: data : numpy.ndarray
1st axis unit, 2nd axis time.
units : bool
Average cv.
Returns: numpy.ndarray
If units=False, series of unit `cv`s.
float
If units=True, mean cv across units.
Examples
>>> cv(np.array([[1, 2, 3, 4, 5, 6], [11, 2, 3, 3, 4, 5]]))
array([ 0.48795004,  0.63887656])
>>> cv(np.array([[1, 2, 3, 4, 5, 6], [11, 2, 3, 3, 4, 5]]), units=True)
0.56341330073710316
-
hybridLFPy.helpers.
decimate
(x, q=10, n=4, k=0.8, filterfun=<function cheby1>)[source]¶ scipy.signal.decimate-like downsampling using filtfilt instead of lfilter, with filter coefficients from a Butterworth or Chebyshev type 1 design.
Parameters: x : numpy.ndarray
Array to be downsampled along last axis.
q : int
Downsampling factor.
n : int
Filter order.
k : float
Aliasing filter critical frequency Wn will be set as Wn=k/q.
filterfun : function
scipy.signal.filter_design.cheby1 or scipy.signal.filter_design.butter function
Returns: numpy.ndarray
Array of downsampled signal.
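For example, downsampling a signal sampled at 0.1 ms resolution by a factor of 10 (sketch with random data; relies on scipy being installed):
import numpy as np
import hybridLFPy.helpers as helpers
x = np.random.randn(6, 10000)                  # e.g., 6 channels at 0.1 ms resolution
x_d = helpers.decimate(x, q=10, n=4, k=0.8)    # downsampled along the last axis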
-
hybridLFPy.helpers.
dump_dict_of_nested_lists_to_h5
(fname, data)[source]¶ Take nested list structure and dump it in hdf5 file.
Parameters: fname : str
Filename
data : dict(list(numpy.ndarray))
Dict of nested lists with variable len arrays.
Returns: None
-
hybridLFPy.helpers.
fano
(data, units=False)[source]¶ Calculate fano factor (FF) of data. Mean and variance are computed across time.
Parameters: data : numpy.ndarray
1st axis unit, 2nd axis time.
units : bool
Average FF.
Returns: numpy.ndarray
If units=False, series of unit FFs.
float
If units=True, mean FF across units.
Examples
>>> fano(np.array([[1, 2, 3, 4, 5, 6], [11, 2, 3, 3, 4, 5]]))
array([ 0.83333333,  1.9047619 ])
>>> fano(np.array([[1, 2, 3, 4, 5, 6], [11, 2, 3, 3, 4, 5]]), units=True)
1.3690476190476191
-
hybridLFPy.helpers.
load_dict_of_nested_lists_from_h5
(fname, toplevelkeys=None)[source]¶ Load nested list structure from hdf5 file
Parameters: fname : str
Filename
toplevelkeys : None or iterable,
Load a two(default) or three-layered structure.
Returns: dict(list(numpy.ndarray))
dictionary of nested lists with variable length array data.
-
hybridLFPy.helpers.
load_h5_data
(path='', data_type='LFP', y=None, electrode=None, warmup=0.0, scaling=1.0)[source]¶ Function loading results from hdf5 file
Parameters: path : str
Path to hdf5-file
data_type : str
Signal types in [‘CSD’ , ‘LFP’, ‘CSDsum’, ‘LFPsum’].
y : None or str
Name of population.
electrode : None or int
TODO: update, electrode is NOT USED
warmup : float
Lower cutoff of time series to remove possible transients
scaling : float,
Scaling factor for population size that determines the amount of loaded single-cell signals
Returns: numpy.ndarray
[electrode id, compound signal] if y is None
numpy.ndarray
[cell id, electrode, single-cell signal] otherwise
-
hybridLFPy.helpers.
mean
(data, units=False, time=False)[source]¶ Function to compute mean of data
Parameters: data : numpy.ndarray
1st axis unit, 2nd axis time
units : bool
Average over units
time : bool
Average over time
Returns: if units=False and time=False:
error
if units=True:
1 dim numpy.ndarray; time series
if time=True:
1 dim numpy.ndarray; series of unit means across time
if units=True and time=True:
float; unit and time mean
Examples
>>> mean(np.array([[1, 2, 3], [4, 5, 6]]), units=True)
array([ 2.5,  3.5,  4.5])
>>> mean(np.array([[1, 2, 3], [4, 5, 6]]), time=True)
array([ 2.,  5.])
>>> mean(np.array([[1, 2, 3], [4, 5, 6]]), units=True, time=True)
3.5
-
hybridLFPy.helpers.
movav
(y, Dx, dx)[source]¶ Moving average rectangular window filter: calculate average of signal y by using sliding rectangular window of size Dx using binsize dx
Parameters: y : numpy.ndarray
Signal
Dx : float
Window length of filter.
dx : float
Bin size of signal sampling.
Returns: numpy.ndarray
Filtered signal.
-
hybridLFPy.helpers.
normalize
(data)[source]¶ Function to normalize data to have mean 0 and unity standard deviation (also called z-transform)
Parameters: data : numpy.ndarray
Returns: numpy.ndarray
z-transform of input array
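For example (sketch):
import numpy as np
import hybridLFPy.helpers as helpers
data = 10. + 5. * np.random.randn(1000)   # arbitrary signal
z = helpers.normalize(data)               # zero mean, unit standard deviation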
-
hybridLFPy.helpers.
powerspec
(data, tbin, Df=None, units=False, pointProcess=False)[source]¶ Calculate (smoothed) power spectra of all timeseries in data. If units=True, power spectra are averaged across units. Note that averaging is done on power spectra rather than data.
If pointProcess is True, power spectra are normalized by the length T of the time series.
Parameters: data : numpy.ndarray
1st axis unit, 2nd axis time.
tbin : float
Binsize in ms.
Df : float/None,
Window width of sliding rectangular filter (smoothing), None is no smoothing.
units : bool
Average power spectrum.
pointProcess : bool
If set to True, powerspectrum is normalized to signal length T.
Returns: freq : tuple
numpy.ndarray of frequencies.
POW : tuple
- if units=False:
2 dim numpy.ndarray; 1st axis unit, 2nd axis frequency
- if units=True:
1 dim numpy.ndarray; frequency series
Examples
>>> powerspec(np.array([analog_sig1, analog_sig2]), tbin, Df=Df)
Out[1]: (freq, POW)
>>> POW.shape
Out[2]: (2, len(analog_sig1))
>>> powerspec(np.array([analog_sig1, analog_sig2]), tbin, Df=Df, units=True)
Out[1]: (freq, POW)
>>> POW.shape
Out[2]: (len(analog_sig1),)
-
hybridLFPy.helpers.
read_gdf
(fname)[source]¶ Fast line-by-line gdf-file reader.
Parameters: fname : str
Path to gdf-file.
Returns: numpy.ndarray
Array with rows [gid, val0, val1, ...], dtype=object (mixed datatype array).
-
hybridLFPy.helpers.
setup_file_dest
(params, clearDestination=True)[source]¶ Function to set up the file catalog structure for simulation output
Parameters: params : object
e.g., cellsim16popsParams.multicompartment_params()
clearDestination : bool
Savefolder will be cleared if already existing.
Returns: None
-
hybridLFPy.helpers.
variance
(data, units=False, time=False)[source]¶ Compute the variance of data across time, units or both.
Parameters: data : numpy.ndarray
1st axis unit, 2nd axis time.
units : bool
Variance across units
time : bool
Average over time
Returns: if units=False and time=False:
Exception
if units=True:
1 dim numpy.ndarray; time series
if time=True:
1 dim numpy.ndarray; series of single unit variances across time
if units=True and time=True:
float; mean of single unit variances across time
Examples
>>> variance(np.array([[1, 2, 3], [4, 5, 6]]), units=True)
array([ 2.25,  2.25,  2.25])
>>> variance(np.array([[1, 2, 3], [4, 5, 6]]), time=True)
array([ 0.66666667,  0.66666667])
>>> variance(np.array([[1, 2, 3], [4, 5, 6]]), units=True, time=True)
0.66666666666666663
submodule csd
Specification of function true_lam_csd for calculation of the true laminar CSD from the current distribution on LFPy.cell.Cell objects, assuming line sources for each individual compartment, including the soma.
-
hybridLFPy.csd.
true_lam_csd
(cell, dr=100, z=None)[source]¶ Return the CSD computed from the membrane currents as a function of position along the z-axis coordinates of the electrode.
Parameters: cell : LFPy.cell.Cell or LFPy.cell.TemplateCell object.
Cell.
dr : float
Radius of the cylindrical CSD volume.
z : numpy.ndarray
Z-coordinates of electrode.
Returns: CSD : numpy.ndarray
Current-source density (in pA * mum^-3).
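A minimal sketch, assuming LFPy is installed and that the example morphology morphologies/ex.hoc from the default cellParams above is available; the electrode z-coordinates match the example electrodeParams:
import numpy as np
import LFPy
from hybridLFPy.csd import true_lam_csd
cell = LFPy.Cell(morphology='morphologies/ex.hoc')
cell.simulate(rec_imem=True)    # transmembrane currents are needed for the CSD
CSD = true_lam_csd(cell, dr=100,
                   z=np.array([0., -100., -200., -300., -400., -500.]))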