io

Dpy(fname[, mode, compression])

Methods

load_pickle(fname)

Load object from pickle file fname.

orientation_from_string(string_ornt)

Returns an array representation of an ornt string

orientation_to_string(ornt)

Returns a string representation of a 3d ornt

ornt_mapping(ornt1, ornt2)

Calculates the mapping needed to get from ornt1 to ornt2

read_bvals_bvecs(fbvals, fbvecs)

Read b-values and b-vectors from disk

read_bvec_file(filename[, atol])

Read gradient table information from a pair of files with extensions .bvec and .bval.

reorient_on_axis(input, current_ornt, new_ornt)

reorient_vectors(input, current_ornt, new_ornt)

Changes the orientation of gradients or other vectors

save_pickle(fname, dix)

Save dix to fname as pickle.

Module: io.bvectxt

orientation_from_string(string_ornt)

Returns an array representation of an ornt string

orientation_to_string(ornt)

Returns a string representation of a 3d ornt

ornt_mapping(ornt1, ornt2)

Calculates the mapping needed to get from ornt1 to ornt2

read_bvec_file(filename[, atol])

Read gradient table information from a pair of files with extensions .bvec and .bval.

reorient_on_axis(input, current_ornt, new_ornt)

reorient_vectors(input, current_ornt, new_ornt)

Changes the orientation of gradients or other vectors

splitext(p)

Split the extension from a pathname.

Module: io.dpy

A class for handling large tractography datasets.

It is built using h5py, which in turn implements key features of the HDF5 (hierarchical data format) API [1].

Dpy(fname[, mode, compression])

Methods

Streamlines

alias of nibabel.streamlines.array_sequence.ArraySequence

Module: io.gradients

InTemporaryDirectory([suffix, prefix, dir])

Create, return, and change directory to a temporary directory

read_bvals_bvecs(fbvals, fbvecs)

Read b-values and b-vectors from disk

splitext(p)

Split the extension from a pathname.

Module: io.image

load_nifti(fname[, return_img, …])

Load data and other information from a nifti file.

load_nifti_data(fname[, as_ndarray])

Load only the data array from a nifti file.

save_nifti(fname, data, affine[, hdr])

Save a data array into a nifti file.

save_qa_metric(fname, xopt, fopt)

Save Quality Assurance metrics.

Module: io.peaks

PeaksAndMetrics

Attributes

Sphere([x, y, z, theta, phi, xyz, faces, edges])

Points on the unit sphere.

load_peaks(fname[, verbose])

Load a PeaksAndMetrics HDF5 file (PAM5)

peaks_to_niftis(pam, fname_shm, fname_dirs, …)

Save SH, directions, indices and values of peaks to Nifti.

reshape_peaks_for_visualization(peaks)

Reshape peaks for visualization.

save_nifti(fname, data, affine[, hdr])

Save a data array into a nifti file.

save_peaks(fname, pam[, affine, verbose])

Save all important attributes of object PeaksAndMetrics in a PAM5 file (HDF5).

Module: io.pickles

Load and save pickles

load_pickle(fname)

Load object from pickle file fname.

save_pickle(fname, dix)

Save dix to fname as pickle.

Module: io.stateful_tractogram

Origin

Enum to simplify future change to convention

PerArrayDict([n_rows])

Dictionary for which key access can do slicing on the values.

PerArraySequenceDict([n_rows])

Dictionary for which key access can do slicing on the values.

Space

Enum to simplify future change to convention

StatefulTractogram(streamlines, reference, space)

Class for stateful representation of collections of streamlines. Object designed to be identical no matter the file format (trk, tck, vtk, fib, dpy).

Streamlines

alias of nibabel.streamlines.array_sequence.ArraySequence

Tractogram([streamlines, …])

Container for streamlines and their data information.

product

product(*iterables, repeat=1) --> product object

apply_affine(aff, pts)

Apply affine matrix aff to points pts

bisect()

bisect_right(a, x[, lo[, hi]]) -> index

deepcopy(x[, memo, _nil])

Deep copy operation on arbitrary Python objects.

get_reference_info(reference)

Return the spatial attributes (affine, dimensions, voxel sizes, voxel order) of a reference

is_header_compatible(reference_1, reference_2)

Will compare the spatial attribute of 2 references

is_reference_info_valid(affine, dimensions, …)

Validate basic data type and value of spatial attribute.

set_sft_logger_level(log_level)

Change the logger of the StatefulTractogram to one of the following: DEBUG, INFO, WARNING, CRITICAL, ERROR

Module: io.streamline

Dpy(fname[, mode, compression])

Methods

Origin

Enum to simplify future change to convention

Space

Enum to simplify future change to convention

StatefulTractogram(streamlines, reference, space)

Class for stateful representation of collections of streamlines. Object designed to be identical no matter the file format (trk, tck, vtk, fib, dpy).

Tractogram([streamlines, …])

Container for streamlines and their data information.

create_tractogram_header(tractogram_type, …)

Write a standard trk/tck header from spatial attribute

deepcopy(x[, memo, _nil])

Deep copy operation on arbitrary Python objects.

detect_format(fileobj)

Returns the StreamlinesFile object guessed from the file-like object.

is_header_compatible(reference_1, reference_2)

Will compare the spatial attribute of 2 references

load_dpy(filename, reference[, to_space, …])

Load the stateful tractogram of the .dpy format

load_fib(filename, reference[, to_space, …])

Load the stateful tractogram of the .fib format

load_generator(ttype)

Generate a loading function that performs a file extension check to restrict the user to a single file format.

load_tck(filename, reference[, to_space, …])

Load the stateful tractogram of the .tck format

load_tractogram(filename, reference[, …])

Load the stateful tractogram from any format (trk, tck, vtk, fib, dpy)

load_trk(filename, reference[, to_space, …])

Load the stateful tractogram of the .trk format

load_vtk(filename, reference[, to_space, …])

Load the stateful tractogram of the .vtk format

load_vtk_streamlines(filename[, to_lps])

Load streamlines from vtk polydata.

save_dpy(sft, filename[, bbox_valid_check])

Save the stateful tractogram of the .dpy format

save_fib(sft, filename[, bbox_valid_check])

Save the stateful tractogram of the .fib format

save_generator(ttype)

Generate a saving function that performs a file extension check to restrict the user to a single file format.

save_tck(sft, filename[, bbox_valid_check])

Save the stateful tractogram of the .tck format

save_tractogram(sft, filename[, …])

Save the stateful tractogram in any format (trk, tck, vtk, fib, dpy)

save_trk(sft, filename[, bbox_valid_check])

Save the stateful tractogram of the .trk format

save_vtk(sft, filename[, bbox_valid_check])

Save the stateful tractogram of the .vtk format

save_vtk_streamlines(streamlines, filename)

Save streamlines as vtk polydata to a supported format file.

Module: io.utils

Utility functions for file formats

Nifti1Image(dataobj, affine[, header, …])

Class for single file NIfTI1 format image

create_nifti_header(affine, dimensions, …)

Write a standard nifti header from spatial attribute

create_tractogram_header(tractogram_type, …)

Write a standard trk/tck header from spatial attribute

decfa(img_orig[, scale])

Create a nifti-compliant directional-encoded color FA image.

decfa_to_float(img_orig)

Convert a nifti-compliant directional-encoded color FA image into a nifti image with RGB encoded in floating point resolution.

detect_format(fileobj)

Returns the StreamlinesFile object guessed from the file-like object.

get_reference_info(reference)

Return the spatial attributes (affine, dimensions, voxel sizes, voxel order) of a reference

is_header_compatible(reference_1, reference_2)

Will compare the spatial attribute of 2 references

is_reference_info_valid(affine, dimensions, …)

Validate basic data type and value of spatial attribute.

make5d(input)

Reshapes the input to have 5 dimensions, adding the extra dimensions just before the last dimension

nifti1_symmat(image_data, *args, **kwargs)

Returns a Nifti1Image with a symmetric matrix intent

optional_package(name[, trip_msg])

Return package-like thing and module setup for package name

save_buan_profiles_hdf5(fname, dt)

Saves the given input dataframe to .h5 file

Module: io.vtk

load_polydata(file_name)

Load a vtk polydata from a supported format file.

load_vtk_streamlines(filename[, to_lps])

Load streamlines from vtk polydata.

optional_package(name[, trip_msg])

Return package-like thing and module setup for package name

save_polydata(polydata, file_name[, binary, …])

Save a vtk polydata to a supported format file.

save_vtk_streamlines(streamlines, filename)

Save streamlines as vtk polydata to a supported format file.

setup_module()

transform_streamlines(streamlines, mat[, …])

Apply affine transformation to streamlines

Dpy

class dipy.io.Dpy(fname, mode='r', compression=0)

Bases: object

Methods

read_track(self)

read one track each time

read_tracks(self)

read the entire tractography

read_tracksi(self, indices)

read tracks with specific indices

write_track(self, track)

write one track each time

write_tracks(self, tracks)

write many tracks together

close

version

__init__(self, fname, mode='r', compression=0)

Advanced storage system for tractography based on HDF5

Parameters
fname : str, full filename
mode : 'r' read

'w' write, 'r+' read and write only if file already exists

compression : int, 0 no compression to 9 maximum compression

Examples

>>> import os
>>> from tempfile import mkstemp #temp file
>>> import numpy as np
>>> from dipy.io.dpy import Dpy
>>> def dpy_example():
...     fd,fname = mkstemp()
...     fname += '.dpy'#add correct extension
...     dpw = Dpy(fname,'w')
...     A=np.ones((5,3))
...     B=2*A.copy()
...     C=3*A.copy()
...     dpw.write_track(A)
...     dpw.write_track(B)
...     dpw.write_track(C)
...     dpw.close()
...     dpr = Dpy(fname,'r')
...     dpr.read_track()
...     dpr.read_track()
...     dpr.read_tracksi([0, 1, 2, 0, 0, 2])
...     dpr.close()
...     os.remove(fname) #delete file from disk
>>> dpy_example()
close(self)
read_track(self)

read one track each time

read_tracks(self)

read the entire tractography

read_tracksi(self, indices)

read tracks with specific indices

version(self)
write_track(self, track)

write one track each time

write_tracks(self, tracks)

write many tracks together

load_pickle

dipy.io.load_pickle(fname)

Load object from pickle file fname.

Parameters
fname : str

filename to load dict or other python object

Returns
dix : object

dictionary or other object

See also

dipy.io.pickles.save_pickle

orientation_from_string

dipy.io.orientation_from_string(string_ornt)

Returns an array representation of an ornt string

orientation_to_string

dipy.io.orientation_to_string(ornt)

Returns a string representation of a 3d ornt

ornt_mapping

dipy.io.ornt_mapping(ornt1, ornt2)

Calculates the mapping needed to get from ornt1 to ornt2

read_bvals_bvecs

dipy.io.read_bvals_bvecs(fbvals, fbvecs)

Read b-values and b-vectors from disk

Parameters
fbvals : str

Full path to file with b-values. None to not read bvals.

fbvecs : str

Full path of file with b-vectors. None to not read bvecs.

Returns
bvals : array, (N,) or None
bvecs : array, (N, 3) or None

Notes

Files can be either ‘.bvals’/’.bvecs’ or ‘.txt’ or ‘.npy’ (containing arrays stored with the appropriate values).

read_bvec_file

dipy.io.read_bvec_file(filename, atol=0.001)

Read gradient table information from a pair of files with extensions .bvec and .bval. The bval file should have one row of values representing the b-values of each volume in the DWI data set. The bvec file should have three rows, where the rows are the x, y, and z components of the normalized gradient direction for each of the volumes.

Parameters
filename :

The path to the either the bvec or bval file

atol : float, optional

The tolerance used to check all the gradient directions are normalized. Default is .001

reorient_on_axis

dipy.io.reorient_on_axis(input, current_ornt, new_ornt, axis=0)

reorient_vectors

dipy.io.reorient_vectors(input, current_ornt, new_ornt, axis=0)

Changes the orientation of a gradients or other vectors

Moves vectors, stored along axis, from current_ornt to new_ornt. For example the vector [x, y, z] in “RAS” will be [-x, -y, z] in “LPS”.

R: Right A: Anterior S: Superior L: Left P: Posterior I: Inferior

Examples

>>> import numpy as np
>>> from dipy.io import reorient_vectors
>>> gtab = np.array([[1, 1, 1], [1, 2, 3]])
>>> reorient_vectors(gtab, 'ras', 'asr', axis=1)
array([[1, 1, 1],
       [2, 3, 1]])
>>> reorient_vectors(gtab, 'ras', 'lps', axis=1)
array([[-1, -1,  1],
       [-1, -2,  3]])
>>> bvec = gtab.T
>>> reorient_vectors(bvec, 'ras', 'lps', axis=0)
array([[-1, -1],
       [-1, -2],
       [ 1,  3]])
>>> reorient_vectors(bvec, 'ras', 'lsp')
array([[-1, -1],
       [ 1,  3],
       [-1, -2]])

save_pickle

dipy.io.save_pickle(fname, dix)

Save dix to fname as pickle.

Parameters
fname : str

filename to save object e.g. a dictionary

dix : object

dictionary or other object

Examples

>>> import os
>>> from tempfile import mkstemp
>>> fd, fname = mkstemp() # make temporary file (opened, attached to fh)
>>> d={0:{'d':1}}
>>> save_pickle(fname, d)
>>> d2=load_pickle(fname)

We remove the temporary file we created for neatness

>>> os.close(fd) # the file is still open, we need to close the fh
>>> os.remove(fname)

orientation_from_string

dipy.io.bvectxt.orientation_from_string(string_ornt)

Returns an array representation of an ornt string

orientation_to_string

dipy.io.bvectxt.orientation_to_string(ornt)

Returns a string representation of a 3d ornt

ornt_mapping

dipy.io.bvectxt.ornt_mapping(ornt1, ornt2)

Calculates the mapping needed to get from ornt1 to ornt2

read_bvec_file

dipy.io.bvectxt.read_bvec_file(filename, atol=0.001)

Read gradient table information from a pair of files with extensions .bvec and .bval. The bval file should have one row of values representing the b-values of each volume in the DWI data set. The bvec file should have three rows, where the rows are the x, y, and z components of the normalized gradient direction for each of the volumes.

Parameters
filename :

The path to the either the bvec or bval file

atol : float, optional

The tolerance used to check all the gradient directions are normalized. Default is .001

reorient_on_axis

dipy.io.bvectxt.reorient_on_axis(input, current_ornt, new_ornt, axis=0)

reorient_vectors

dipy.io.bvectxt.reorient_vectors(input, current_ornt, new_ornt, axis=0)

Changes the orientation of gradients or other vectors

Moves vectors, stored along axis, from current_ornt to new_ornt. For example the vector [x, y, z] in “RAS” will be [-x, -y, z] in “LPS”.

R: Right A: Anterior S: Superior L: Left P: Posterior I: Inferior

Examples

>>> import numpy as np
>>> from dipy.io.bvectxt import reorient_vectors
>>> gtab = np.array([[1, 1, 1], [1, 2, 3]])
>>> reorient_vectors(gtab, 'ras', 'asr', axis=1)
array([[1, 1, 1],
       [2, 3, 1]])
>>> reorient_vectors(gtab, 'ras', 'lps', axis=1)
array([[-1, -1,  1],
       [-1, -2,  3]])
>>> bvec = gtab.T
>>> reorient_vectors(bvec, 'ras', 'lps', axis=0)
array([[-1, -1],
       [-1, -2],
       [ 1,  3]])
>>> reorient_vectors(bvec, 'ras', 'lsp')
array([[-1, -1],
       [ 1,  3],
       [-1, -2]])

splitext

dipy.io.bvectxt.splitext(p)

Split the extension from a pathname.

Extension is everything from the last dot to the end, ignoring leading dots. Returns “(root, ext)”; ext may be empty.
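dipy.io.bvectxt.splitext appears to be the standard library's os.path.splitext re-exported, so it behaves exactly like that function:

```python
from os.path import splitext  # same behavior as dipy.io.bvectxt.splitext

# Splits off everything from the last dot, ignoring leading dots.
root, ext = splitext('subject1.bvec')
# root == 'subject1', ext == '.bvec'
```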

Dpy

class dipy.io.dpy.Dpy(fname, mode='r', compression=0)

Bases: object

Methods

read_track(self)

read one track each time

read_tracks(self)

read the entire tractography

read_tracksi(self, indices)

read tracks with specific indices

write_track(self, track)

write one track each time

write_tracks(self, tracks)

write many tracks together

close

version

__init__(self, fname, mode='r', compression=0)

Advanced storage system for tractography based on HDF5

Parameters
fname : str, full filename
mode : 'r' read

'w' write, 'r+' read and write only if file already exists

compression : int, 0 no compression to 9 maximum compression

Examples

>>> import os
>>> from tempfile import mkstemp #temp file
>>> import numpy as np
>>> from dipy.io.dpy import Dpy
>>> def dpy_example():
...     fd,fname = mkstemp()
...     fname += '.dpy'#add correct extension
...     dpw = Dpy(fname,'w')
...     A=np.ones((5,3))
...     B=2*A.copy()
...     C=3*A.copy()
...     dpw.write_track(A)
...     dpw.write_track(B)
...     dpw.write_track(C)
...     dpw.close()
...     dpr = Dpy(fname,'r')
...     dpr.read_track()
...     dpr.read_track()
...     dpr.read_tracksi([0, 1, 2, 0, 0, 2])
...     dpr.close()
...     os.remove(fname) #delete file from disk
>>> dpy_example()
close(self)
read_track(self)

read one track each time

read_tracks(self)

read the entire tractography

read_tracksi(self, indices)

read tracks with specific indices

version(self)
write_track(self, track)

write one track each time

write_tracks(self, tracks)

write many tracks together

Streamlines

dipy.io.dpy.Streamlines

alias of nibabel.streamlines.array_sequence.ArraySequence

InTemporaryDirectory

class dipy.io.gradients.InTemporaryDirectory(suffix='', prefix='tmp', dir=None)

Bases: nibabel.tmpdirs.TemporaryDirectory

Create, return, and change directory to a temporary directory

Examples

>>> import os
>>> my_cwd = os.getcwd()
>>> with InTemporaryDirectory() as tmpdir:
...     _ = open('test.txt', 'wt').write('some text')
...     assert os.path.isfile('test.txt')
...     assert os.path.isfile(os.path.join(tmpdir, 'test.txt'))
>>> os.path.exists(tmpdir)
False
>>> os.getcwd() == my_cwd
True

Methods

cleanup

__init__(self, suffix='', prefix='tmp', dir=None)

Initialize self. See help(type(self)) for accurate signature.

read_bvals_bvecs

dipy.io.gradients.read_bvals_bvecs(fbvals, fbvecs)

Read b-values and b-vectors from disk

Parameters
fbvals : str

Full path to file with b-values. None to not read bvals.

fbvecs : str

Full path of file with b-vectors. None to not read bvecs.

Returns
bvals : array, (N,) or None
bvecs : array, (N, 3) or None

Notes

Files can be either ‘.bvals’/’.bvecs’ or ‘.txt’ or ‘.npy’ (containing arrays stored with the appropriate values).

splitext

dipy.io.gradients.splitext(p)

Split the extension from a pathname.

Extension is everything from the last dot to the end, ignoring leading dots. Returns “(root, ext)”; ext may be empty.

load_nifti

dipy.io.image.load_nifti(fname, return_img=False, return_voxsize=False, return_coords=False, as_ndarray=True)

Load data and other information from a nifti file.

Parameters
fname : str

Full path to a nifti file.

return_img : bool, optional

Whether to return the nibabel nifti img object. Default: False

return_voxsize: bool, optional

Whether to return the nifti header zooms. Default: False

return_coords : bool, optional

Whether to return the nifti header aff2axcodes. Default: False

as_ndarray: bool, optional

convert nibabel ArrayProxy to a numpy.ndarray. If you want to save memory and delay this casting, just turn this option to False (default: True)

Returns
A tuple, with (at the most, if all keyword args are set to True):
(data, img.affine, img, vox_size, nib.aff2axcodes(img.affine))

See also

load_nifti_data

load_nifti_data

dipy.io.image.load_nifti_data(fname, as_ndarray=True)

Load only the data array from a nifti file.

Parameters
fname : str

Full path to the file.

as_ndarray: bool, optional

convert nibabel ArrayProxy to a numpy.ndarray. If you want to save memory and delay this casting, just turn this option to False (default: True)

Returns
data: np.ndarray or nib.ArrayProxy

See also

load_nifti

save_nifti

dipy.io.image.save_nifti(fname, data, affine, hdr=None)

Save a data array into a nifti file.

Parameters
fname : str

The full path to the file to be saved.

data : ndarray

The array with the data to save.

affine : 4x4 array

The affine transform associated with the file.

hdr : nifti header, optional

May contain additional information to store in the file header.

Returns
None

save_qa_metric

dipy.io.image.save_qa_metric(fname, xopt, fopt)

Save Quality Assurance metrics.

Parameters
fname: string

File name to save the metric values.

xopt: numpy array

The metric containing the optimal parameters for image registration.

fopt: int

The distance between the registered images.

PeaksAndMetrics

class dipy.io.peaks.PeaksAndMetrics

Bases: dipy.reconst.eudx_direction_getter.EuDXDirectionGetter

Attributes
ang_thr
qa_thr
total_weight

Methods

initial_direction()

The best starting directions for fiber tracking from point

get_direction

__init__(self, /, *args, **kwargs)

Initialize self. See help(type(self)) for accurate signature.

Sphere

class dipy.io.peaks.Sphere(x=None, y=None, z=None, theta=None, phi=None, xyz=None, faces=None, edges=None)

Bases: object

Points on the unit sphere.

The sphere can be constructed using one of three conventions:

Sphere(x, y, z)
Sphere(xyz=xyz)
Sphere(theta=theta, phi=phi)
Parameters
x, y, z : 1-D array_like

Vertices as x-y-z coordinates.

theta, phi : 1-D array_like

Vertices as spherical coordinates. Theta and phi are the inclination and azimuth angles respectively.

xyz : (N, 3) ndarray

Vertices as x-y-z coordinates.

faces : (N, 3) ndarray

Indices into vertices that form triangular faces. If unspecified, the faces are computed using a Delaunay triangulation.

edges : (N, 2) ndarray

Edges between vertices. If unspecified, the edges are derived from the faces.

Edges between vertices. If unspecified, the edges are derived from the faces.

Attributes
x
y
z

Methods

find_closest(self, xyz)

Find the index of the vertex in the Sphere closest to the input vector

subdivide(self[, n])

Subdivides each face of the sphere into four new faces.

edges

faces

vertices

__init__(self, x=None, y=None, z=None, theta=None, phi=None, xyz=None, faces=None, edges=None)

Initialize self. See help(type(self)) for accurate signature.

edges(self)
faces(self)
find_closest(self, xyz)

Find the index of the vertex in the Sphere closest to the input vector

Parameters
xyz : array-like, 3 elements

A unit vector

Returns
idx : int

The index into the Sphere.vertices array that gives the closest vertex (in angle).

subdivide(self, n=1)

Subdivides each face of the sphere into four new faces.

New vertices are created at a, b, and c. Then each face [x, y, z] is divided into faces [x, a, c], [y, a, b], [z, b, c], and [a, b, c].

         y
         /\
        /  \
      a/____\b
      /\    /\
     /  \  /  \
    /____\/____\
   x      c     z
Parameters
n : int, optional

The number of subdivisions to perform.

Returns
new_sphere : Sphere

The subdivided sphere.

vertices(self)
property x
property y
property z

load_peaks

dipy.io.peaks.load_peaks(fname, verbose=False)

Load a PeaksAndMetrics HDF5 file (PAM5)

Parameters
fname : string

Filename of PAM5 file.

verbose : bool

Print summary information about the loaded file.

Returns
pam : PeaksAndMetrics object

peaks_to_niftis

dipy.io.peaks.peaks_to_niftis(pam, fname_shm, fname_dirs, fname_values, fname_indices, fname_gfa, reshape_dirs=False)

Save SH, directions, indices and values of peaks to Nifti.

reshape_peaks_for_visualization

dipy.io.peaks.reshape_peaks_for_visualization(peaks)

Reshape peaks for visualization.

Reshape and convert to float32 a set of peaks for visualisation with mrtrix or the fibernavigator.

Parameters
peaks: nd array (…, N, 3) or PeaksAndMetrics object

The peaks to be reshaped and converted to float32.

Returns
peaks : nd array (…, 3*N)

save_nifti

dipy.io.peaks.save_nifti(fname, data, affine, hdr=None)

Save a data array into a nifti file.

Parameters
fname : str

The full path to the file to be saved.

data : ndarray

The array with the data to save.

affine : 4x4 array

The affine transform associated with the file.

hdr : nifti header, optional

May contain additional information to store in the file header.

Returns
None

save_peaks

dipy.io.peaks.save_peaks(fname, pam, affine=None, verbose=False)

Save all important attributes of object PeaksAndMetrics in a PAM5 file (HDF5).

Parameters
fname : string

Filename of PAM5 file

pam : PeaksAndMetrics

Object holding peak_dirs, shm_coeffs and other attributes

affine : array

The 4x4 matrix transforming the data from native to world coordinates. PeaksAndMetrics should have that attribute but if not it can be provided here. Default None.

verbose : bool

Print summary information about the saved file.

load_pickle

dipy.io.pickles.load_pickle(fname)

Load object from pickle file fname.

Parameters
fname : str

filename to load dict or other python object

Returns
dix : object

dictionary or other object

See also

dipy.io.pickles.save_pickle

save_pickle

dipy.io.pickles.save_pickle(fname, dix)

Save dix to fname as pickle.

Parameters
fname : str

filename to save object e.g. a dictionary

dix : object

dictionary or other object

Examples

>>> import os
>>> from tempfile import mkstemp
>>> fd, fname = mkstemp() # make temporary file (opened, attached to fh)
>>> d={0:{'d':1}}
>>> save_pickle(fname, d)
>>> d2=load_pickle(fname)

We remove the temporary file we created for neatness

>>> os.close(fd) # the file is still open, we need to close the fh
>>> os.remove(fname)

Origin

class dipy.io.stateful_tractogram.Origin

Bases: enum.Enum

Enum to simplify future change to convention

__init__(self, /, *args, **kwargs)

Initialize self. See help(type(self)) for accurate signature.

NIFTI = 'center'
TRACKVIS = 'corner'

PerArrayDict

class dipy.io.stateful_tractogram.PerArrayDict(n_rows=0, *args, **kwargs)

Bases: nibabel.streamlines.tractogram.SliceableDataDict

Dictionary for which key access can do slicing on the values.

This container behaves like a standard dictionary but extends key access so that keys may be indices slicing into the contained ndarray values. The elements must also be ndarrays.

In addition, it makes sure the amount of data contained in those ndarrays matches the number of streamlines given at the instantiation of this instance.

Parameters
n_rows : None or int, optional

Number of rows per value in each key, value pair or None for not specified.

*args :
**kwargs :

Positional and keyword arguments, passed straight through the dict constructor.

Methods

clear(self)

extend(self, other)

Appends the elements of another PerArrayDict.

get(self, key[, default])

items(self)

keys(self)

pop(self, key[, default])

If key is not found, d is returned if given, otherwise KeyError is raised.

popitem(self)

Remove and return a (key, value) pair as a 2-tuple; raise KeyError if D is empty.

setdefault(self, key[, default])

update(*args, **kwds)

If E present and has a .keys() method, does: for k in E: D[k] = E[k] If E present and lacks .keys() method, does: for (k, v) in E: D[k] = v In either case, this is followed by: for k, v in F.items(): D[k] = v

values(self)

__init__(self, n_rows=0, *args, **kwargs)

Initialize self. See help(type(self)) for accurate signature.

extend(self, other)

Appends the elements of another PerArrayDict.

That is, for each entry in this dictionary, we append the elements coming from the other dictionary at the corresponding entry.

Parameters
other : PerArrayDict object

Its data will be appended to the data of this dictionary.

Returns
None

Notes

The keys in both dictionaries must be the same.

PerArraySequenceDict

class dipy.io.stateful_tractogram.PerArraySequenceDict(n_rows=0, *args, **kwargs)

Bases: nibabel.streamlines.tractogram.PerArrayDict

Dictionary for which key access can do slicing on the values.

This container behaves like a standard dictionary but extends key access so that keys may be indices slicing into the contained ndarray values. The elements must also be ArraySequence.

In addition, it makes sure the amount of data contained in those array sequences matches the number of elements given at the instantiation of the instance.

Methods

clear(self)

extend(self, other)

Appends the elements of another PerArrayDict.

get(self, key[, default])

items(self)

keys(self)

pop(self, key[, default])

If key is not found, d is returned if given, otherwise KeyError is raised.

popitem(self)

Remove and return a (key, value) pair as a 2-tuple; raise KeyError if D is empty.

setdefault(self, key[, default])

update(*args, **kwds)

If E present and has a .keys() method, does: for k in E: D[k] = E[k] If E present and lacks .keys() method, does: for (k, v) in E: D[k] = v In either case, this is followed by: for k, v in F.items(): D[k] = v

values(self)

__init__(self, n_rows=0, *args, **kwargs)

Initialize self. See help(type(self)) for accurate signature.

Space

class dipy.io.stateful_tractogram.Space

Bases: enum.Enum

Enum to simplify future change to convention

__init__(self, /, *args, **kwargs)

Initialize self. See help(type(self)) for accurate signature.

RASMM = 'rasmm'
VOX = 'vox'
VOXMM = 'voxmm'

StatefulTractogram

class dipy.io.stateful_tractogram.StatefulTractogram(streamlines, reference, space, origin=<Origin.NIFTI: 'center'>, data_per_point=None, data_per_streamline=None)

Bases: object

Class for stateful representation of collections of streamlines. Object designed to be identical no matter the file format (trk, tck, vtk, fib, dpy). Facilitates transformation between spaces and data manipulation for each streamline / point.

Attributes
affine

Getter for the reference affine

data_per_point

Getter for data_per_point

data_per_streamline

Getter for data_per_streamline

dimensions

Getter for the reference dimensions

origin

Getter for origin standard

space

Getter for the current space

space_attributes

Getter for spatial attribute

streamlines

Partially safe getter for streamlines

voxel_order

Getter for the reference voxel order

voxel_sizes

Getter for the reference voxel sizes

Methods

are_compatible(sft_1, sft_2)

Compatibility verification of two StatefulTractogram objects to ensure space, origin, data_per_point and data_per_streamline consistency

compute_bounding_box(self)

Compute the bounding box of the streamlines in their current state

from_sft(streamlines, sft[, data_per_point, …])

Create an instance of StatefulTractogram from another instance of StatefulTractogram.

get_data_per_point_keys(self)

Return a list of the data_per_point attribute names

get_data_per_streamline_keys(self)

Return a list of the data_per_streamline attribute names

get_streamlines_copy(self)

Safe getter for streamlines (for slicing)

is_bbox_in_vox_valid(self)

Verify that the bounding box is valid in voxel space.

remove_invalid_streamlines(self[, epsilon])

Remove streamlines with invalid coordinates from the object.

to_center(self)

Safe function to shift streamlines so the center of voxel is the origin

to_corner(self)

Safe function to shift streamlines so the corner of voxel is the origin

to_origin(self, target_origin)

Safe function to change streamlines to a particular origin standard. False means NIFTI (center) and True means TrackVis (corner)

to_rasmm(self)

Safe function to transform streamlines and update state

to_space(self, target_space)

Safe function to transform streamlines to a particular space using an enum and update state

to_vox(self)

Safe function to transform streamlines and update state

to_voxmm(self)

Safe function to transform streamlines and update state

__init__(self, streamlines, reference, space, origin=<Origin.NIFTI: 'center'>, data_per_point=None, data_per_streamline=None)

Create a strict, state-aware, robust tractogram

Parameters
streamlines : list or ArraySequence

Streamlines of the tractogram

reference : Nifti or Trk filename, Nifti1Image or TrkFile,

Nifti1Header, trk.header (dict) or another StatefulTractogram. Reference that provides the spatial attributes. Typically a nifti-related object from the native diffusion used for streamlines generation

space : Enum (dipy.io.stateful_tractogram.Space)

Current space in which the streamlines are (vox, voxmm or rasmm). After tracking the space is VOX, after loading with nibabel the space is RASMM.

origin : Enum (dipy.io.stateful_tractogram.Origin), optional

Current origin in which the streamlines are (center or corner). After loading with nibabel the origin is CENTER.

data_per_point : dict, optional

Dictionary in which each key has X items, each item having Y_i items, X being the number of streamlines and Y_i the number of points on streamline #i.

data_per_streamline : dict, optional

Dictionary in which each key has X items, X being the number of streamlines.

Notes

It is very important to respect the convention: verify that the streamlines match the reference and are effectively in the right space.

Any change to the number of streamlines, data_per_point or data_per_streamline requires particular verification.

For manipulations not allowed by this object, use nibabel directly and be careful.

property affine

Getter for the reference affine

static are_compatible(sft_1, sft_2)

Compatibility verification of two StatefulTractogram objects to ensure space, origin, data_per_point and data_per_streamline consistency

compute_bounding_box(self)

Compute the bounding box of the streamlines in their current state

Returns
output : ndarray

8 corners of the XYZ aligned box, all zeros if no streamlines

property data_per_point

Getter for data_per_point

property data_per_streamline

Getter for data_per_streamline

property dimensions

Getter for the reference dimensions

static from_sft(streamlines, sft, data_per_point=None, data_per_streamline=None)

Create an instance of StatefulTractogram from another instance of StatefulTractogram.

Parameters
streamlines : list or ArraySequence

Streamlines of the tractogram

sft : StatefulTractogram

The other StatefulTractogram to copy the space_attributes AND state from.

data_per_point : dict, optional

Dictionary in which each key has X items, and each item has Y_i values, where X is the number of streamlines and Y_i is the number of points on streamline #i

data_per_streamline : dict, optional

Dictionary in which each key has X items, where X is the number of streamlines

get_data_per_point_keys(self)

Return a list of the data_per_point attribute names

get_data_per_streamline_keys(self)

Return a list of the data_per_streamline attribute names

get_streamlines_copy(self)

Safe getter for streamlines (for slicing)

is_bbox_in_vox_valid(self)

Verify that the bounding box is valid in voxel space. Negative coordinates or coordinates above the volume dimensions are considered invalid in voxel space.

Returns
output : bool

Are the streamlines within the volume of the associated reference

property origin

Getter for origin standard

remove_invalid_streamlines(self, epsilon=0.001)

Remove streamlines with invalid coordinates from the object. Also removes the corresponding data_per_point and data_per_streamline entries. Invalid coordinates are any X, Y, Z values above the reference dimensions or below zero.

Parameters
epsilon : float, optional

Epsilon value for the bounding box verification. Default is 0.001.

Returns
output : tuple

Tuple of two lists: indices_to_remove, indices_to_keep

property space

Getter for the current space

property space_attributes

Getter for spatial attribute

property streamlines

Partially safe getter for streamlines

to_center(self)

Safe function to shift streamlines so the center of voxel is the origin

to_corner(self)

Safe function to shift streamlines so the corner of voxel is the origin

to_origin(self, target_origin)

Safe function to change streamlines to a particular origin standard. False means NIFTI (center) and True means TrackVis (corner)

to_rasmm(self)

Safe function to transform streamlines and update state

to_space(self, target_space)

Safe function to transform streamlines to a particular space using an enum and update state

to_vox(self)

Safe function to transform streamlines and update state

to_voxmm(self)

Safe function to transform streamlines and update state

property voxel_order

Getter for the reference voxel order

property voxel_sizes

Getter for the reference voxel sizes

Streamlines

dipy.io.stateful_tractogram.Streamlines

alias of nibabel.streamlines.array_sequence.ArraySequence

Tractogram

class dipy.io.stateful_tractogram.Tractogram(streamlines=None, data_per_streamline=None, data_per_point=None, affine_to_rasmm=None)

Bases: object

Container for streamlines and their data information.

Streamlines of a tractogram can be in any coordinate system of your choice as long as you provide the correct affine_to_rasmm matrix at construction time. When applied to streamline coordinates, that transformation matrix should bring the streamlines back to world space (RAS+ and mm space) [1].

Moreover, when streamlines are mapped back to voxel space [2], a streamline point located at an integer coordinate (i,j,k) is considered to be at the center of the corresponding voxel. This is in contrast with other conventions where it might have referred to a corner.

References

[1] http://nipy.org/nibabel/coordinate_systems.html#naming-reference-spaces
[2] http://nipy.org/nibabel/coordinate_systems.html#voxel-coordinates-are-in-voxel-space

Attributes
streamlines : ArraySequence object

Sequence of \(T\) streamlines. Each streamline is an ndarray of shape (\(N_t\), 3) where \(N_t\) is the number of points of streamline \(t\).

data_per_streamline : PerArrayDict object

Dictionary where the items are (str, 2D array). Each key represents a piece of information \(i\) to be kept alongside every streamline, and its associated value is a 2D array of shape (\(T\), \(P_i\)) where \(T\) is the number of streamlines and \(P_i\) is the number of values to store for that particular piece of information \(i\).

data_per_point : PerArraySequenceDict object

Dictionary where the items are (str, ArraySequence). Each key represents a piece of information \(i\) to be kept alongside every point of every streamline, and its associated value is an iterable of ndarrays of shape (\(N_t\), \(M_i\)) where \(N_t\) is the number of points for a particular streamline \(t\) and \(M_i\) is the number of values to store for that particular piece of information \(i\).

Methods

apply_affine(self, affine[, lazy])

Applies an affine transformation on the points of each streamline.

copy(self)

Returns a copy of this Tractogram object.

extend(self, other)

Appends the data of another Tractogram.

to_world(self[, lazy])

Brings the streamlines to world space (i.e.

__init__(self, streamlines=None, data_per_streamline=None, data_per_point=None, affine_to_rasmm=None)
Parameters
streamlines : iterable of ndarrays or ArraySequence, optional

Sequence of \(T\) streamlines. Each streamline is an ndarray of shape (\(N_t\), 3) where \(N_t\) is the number of points of streamline \(t\).

data_per_streamline : dict of iterable of ndarrays, optional

Dictionary where the items are (str, iterable). Each key represents a piece of information \(i\) to be kept alongside every streamline, and its associated value is an iterable of ndarrays of shape (\(P_i\),) where \(P_i\) is the number of scalar values to store for that particular piece of information \(i\).

data_per_point : dict of iterable of ndarrays, optional

Dictionary where the items are (str, iterable). Each key represents a piece of information \(i\) to be kept alongside every point of every streamline, and its associated value is an iterable of ndarrays of shape (\(N_t\), \(M_i\)) where \(N_t\) is the number of points for a particular streamline \(t\) and \(M_i\) is the number of scalar values to store for that particular piece of information \(i\).

affine_to_rasmm : ndarray of shape (4, 4) or None, optional

Transformation matrix that brings the streamlines contained in this tractogram to RAS+ and mm space where coordinate (0,0,0) refers to the center of the voxel. By default, the streamlines are in an unknown space, i.e. affine_to_rasmm is None.

property affine_to_rasmm

Affine bringing streamlines in this tractogram to RAS+mm.

apply_affine(self, affine, lazy=False)

Applies an affine transformation on the points of each streamline.

If lazy is not specified, this is performed in-place.

Parameters
affine : ndarray of shape (4, 4)

Transformation that will be applied to every streamline.

lazy : {False, True}, optional

If True, streamlines are not transformed in-place and a LazyTractogram object is returned. Otherwise, streamlines are modified in-place.

Returns
tractogram : Tractogram or LazyTractogram object

Tractogram where the streamlines have been transformed according to the given affine transformation. If the lazy option is true, it returns a LazyTractogram object, otherwise it returns a reference to this Tractogram object with updated streamlines.

copy(self)

Returns a copy of this Tractogram object.

property data_per_point
property data_per_streamline
extend(self, other)

Appends the data of another Tractogram.

Data that will be appended includes the streamlines and the content of both dictionaries data_per_streamline and data_per_point.

Parameters
other : Tractogram object

Its data will be appended to the data of this tractogram.

Returns
None

Notes

The entries in both dictionaries self.data_per_streamline and self.data_per_point must match respectively those contained in the other tractogram.

property streamlines
to_world(self, lazy=False)

Brings the streamlines to world space (i.e. RAS+ and mm).

If lazy is not specified, this is performed in-place.

Parameters
lazy : {False, True}, optional

If True, streamlines are not transformed in-place and a LazyTractogram object is returned. Otherwise, streamlines are modified in-place.

Returns
tractogram : Tractogram or LazyTractogram object

Tractogram where the streamlines have been sent to world space. If the lazy option is true, it returns a LazyTractogram object, otherwise it returns a reference to this Tractogram object with updated streamlines.

product

class dipy.io.stateful_tractogram.product

Bases: object

product(*iterables, repeat=1) --> product object

Cartesian product of input iterables. Equivalent to nested for-loops.

For example, product(A, B) returns the same as: ((x,y) for x in A for y in B). The leftmost iterators are in the outermost for-loop, so the output tuples cycle in a manner similar to an odometer (with the rightmost element changing on every iteration).

To compute the product of an iterable with itself, specify the number of repetitions with the optional repeat keyword argument. For example, product(A, repeat=4) means the same as product(A, A, A, A).

product('ab', range(3)) --> ('a',0) ('a',1) ('a',2) ('b',0) ('b',1) ('b',2) product((0,1), (0,1), (0,1)) --> (0,0,0) (0,0,1) (0,1,0) (0,1,1) (1,0,0) ...

__init__(self, /, *args, **kwargs)

Initialize self. See help(type(self)) for accurate signature.

apply_affine

dipy.io.stateful_tractogram.apply_affine(aff, pts)

Apply affine matrix aff to points pts

Returns result of application of aff to the right of pts. The coordinate dimension of pts should be the last.

For the 3D case, aff will be shape (4,4) and pts will have final axis length 3 - maybe it will just be N by 3. The return value is the transformed points, in this case:

res = np.dot(aff[:3,:3], pts.T) + aff[:3,3:4]
transformed_pts = res.T

This routine is more general than 3D, in that aff can have any shape (N,N), and pts can have any shape, as long as the last dimension is for the coordinates, and is therefore length N-1.

Parameters
aff : (N, N) array-like

Homogeneous affine; for 3D points, will be 4 by 4. Contrary to first appearance, the affine will be applied on the left of pts.

pts : (..., N-1) array-like

Points, where the last dimension contains the coordinates of each point. For 3D, the last dimension will be length 3.

Returns
transformed_pts : (..., N-1) array

transformed points

Examples

>>> aff = np.array([[0,2,0,10],[3,0,0,11],[0,0,4,12],[0,0,0,1]])
>>> pts = np.array([[1,2,3],[2,3,4],[4,5,6],[6,7,8]])
>>> apply_affine(aff, pts) 
array([[14, 14, 24],
       [16, 17, 28],
       [20, 23, 36],
       [24, 29, 44]]...)

Just to show that in the simple 3D case, it is equivalent to:

>>> (np.dot(aff[:3,:3], pts.T) + aff[:3,3:4]).T 
array([[14, 14, 24],
       [16, 17, 28],
       [20, 23, 36],
       [24, 29, 44]]...)

But pts can be a more complicated shape:

>>> pts = pts.reshape((2,2,3))
>>> apply_affine(aff, pts) 
array([[[14, 14, 24],
        [16, 17, 28]],

       [[20, 23, 36],
        [24, 29, 44]]]...)

bisect

dipy.io.stateful_tractogram.bisect()

bisect_right(a, x[, lo[, hi]]) -> index

Return the index where to insert item x in list a, assuming a is sorted.

The return value i is such that all e in a[:i] have e <= x, and all e in a[i:] have e > x. So if x already appears in the list, i points just beyond the rightmost x already there

Optional args lo (default 0) and hi (default len(a)) bound the slice of a to be searched.

deepcopy

dipy.io.stateful_tractogram.deepcopy(x, memo=None, _nil=[])

Deep copy operation on arbitrary Python objects.

See the module’s __doc__ string for more info.

get_reference_info

dipy.io.stateful_tractogram.get_reference_info(reference)

Extract the spatial attributes (affine, dimensions, voxel sizes, voxel order) from a reference

Parameters
reference : Nifti or Trk filename, Nifti1Image or TrkFile, Nifti1Header or trk.header (dict)

Reference that provides the spatial attributes.

Returns
output : tuple
  • affine : ndarray (4,4), np.float32, transformation of VOX to RASMM

  • dimensions : ndarray (3,), int16, volume shape for each axis

  • voxel_sizes : ndarray (3,), float32, size of voxel for each axis

  • voxel_order : string, typically 'RAS' or 'LPS'

is_header_compatible

dipy.io.stateful_tractogram.is_header_compatible(reference_1, reference_2)

Will compare the spatial attributes of 2 references

Parameters
reference_1 : Nifti or Trk filename, Nifti1Image or TrkFile, Nifti1Header or trk.header (dict)

Reference that provides the spatial attributes.

reference_2 : Nifti or Trk filename, Nifti1Image or TrkFile, Nifti1Header or trk.header (dict)

Reference that provides the spatial attributes.

Returns
output : bool

Whether all the spatial attributes match

is_reference_info_valid

dipy.io.stateful_tractogram.is_reference_info_valid(affine, dimensions, voxel_sizes, voxel_order)

Validate basic data type and value of spatial attribute.

Does not ensure that voxel_sizes and voxel_order are self-coherent with the affine. Only verifies the following:

  • affine is of the right type (float) and dimension (4,4)

  • affine contains values in the rotation part

  • dimensions is of the right type (int) and length (3)

  • voxel_sizes is of the right type (float) and length (3)

  • voxel_order is of the right type (str) and length (3)

The listed parameters are what is expected; providing anything else should make this function fail (it covers common mistakes).

Parameters
affine: ndarray (4,4)

Transformation of VOX to RASMM

dimensions: ndarray (3,), int16

Volume shape for each axis

voxel_sizes: ndarray (3,), float32

Size of voxel for each axis

voxel_order: string

Typically ‘RAS’ or ‘LPS’

Returns
output : bool

Whether the input represents a valid 'state' of spatial attributes

set_sft_logger_level

dipy.io.stateful_tractogram.set_sft_logger_level(log_level)

Change the logger of the StatefulTractogram to one of the following: DEBUG, INFO, WARNING, CRITICAL, ERROR

Parameters
log_level : str

Log level for the StatefulTractogram only

Dpy

class dipy.io.streamline.Dpy(fname, mode='r', compression=0)

Bases: object

Methods

read_track(self)

read one track at a time

read_tracks(self)

read the entire tractography

read_tracksi(self, indices)

read tracks with specific indices

write_track(self, track)

write one track at a time

write_tracks(self, tracks)

write many tracks together

close

version

__init__(self, fname, mode='r', compression=0)

Advanced storage system for tractography based on HDF5

Parameters
fname : str

Full filename

mode : str

'r' read, 'w' write, 'r+' read and write (only if the file already exists)

compression : int

0 no compression to 9 maximum compression

Examples

>>> import os
>>> import numpy as np
>>> from tempfile import mkstemp  # temp file
>>> from dipy.io.dpy import Dpy
>>> def dpy_example():
...     fd,fname = mkstemp()
...     fname += '.dpy'#add correct extension
...     dpw = Dpy(fname,'w')
...     A=np.ones((5,3))
...     B=2*A.copy()
...     C=3*A.copy()
...     dpw.write_track(A)
...     dpw.write_track(B)
...     dpw.write_track(C)
...     dpw.close()
...     dpr = Dpy(fname,'r')
...     dpr.read_track()
...     dpr.read_track()
...     dpr.read_tracksi([0, 1, 2, 0, 0, 2])
...     dpr.close()
...     os.remove(fname) #delete file from disk
>>> dpy_example()
close(self)
read_track(self)

read one track at a time

read_tracks(self)

read the entire tractography

read_tracksi(self, indices)

read tracks with specific indices

version(self)
write_track(self, track)

write one track at a time

write_tracks(self, tracks)

write many tracks together

Origin

class dipy.io.streamline.Origin

Bases: enum.Enum

Enum to simplify future changes to the convention

__init__(self, /, *args, **kwargs)

Initialize self. See help(type(self)) for accurate signature.

NIFTI = 'center'
TRACKVIS = 'corner'

Space

class dipy.io.streamline.Space

Bases: enum.Enum

Enum to simplify future changes to the convention

__init__(self, /, *args, **kwargs)

Initialize self. See help(type(self)) for accurate signature.

RASMM = 'rasmm'
VOX = 'vox'
VOXMM = 'voxmm'

StatefulTractogram

class dipy.io.streamline.StatefulTractogram(streamlines, reference, space, origin=<Origin.NIFTI: 'center'>, data_per_point=None, data_per_streamline=None)

Bases: object

Class for stateful representation of collections of streamlines Object designed to be identical no matter the file format (trk, tck, vtk, fib, dpy). Facilitate transformation between space and data manipulation for each streamline / point.

Attributes
affine

Getter for the reference affine

data_per_point

Getter for data_per_point

data_per_streamline

Getter for data_per_streamline

dimensions

Getter for the reference dimensions

origin

Getter for origin standard

space

Getter for the current space

space_attributes

Getter for spatial attribute

streamlines

Partially safe getter for streamlines

voxel_order

Getter for the reference voxel order

voxel_sizes

Getter for the reference voxel sizes

Methods

are_compatible(sft_1, sft_2)

Compatibility verification of two StatefulTractogram to ensure space, origin, data_per_point and data_per_streamline consistency

compute_bounding_box(self)

Compute the bounding box of the streamlines in their current state

from_sft(streamlines, sft[, data_per_point, …])

Create an instance of StatefulTractogram from another instance of StatefulTractogram.

get_data_per_point_keys(self)

Return a list of the data_per_point attribute names

get_data_per_streamline_keys(self)

Return a list of the data_per_streamline attribute names

get_streamlines_copy(self)

Safe getter for streamlines (for slicing)

is_bbox_in_vox_valid(self)

Verify that the bounding box is valid in voxel space.

remove_invalid_streamlines(self[, epsilon])

Remove streamlines with invalid coordinates from the object.

to_center(self)

Safe function to shift streamlines so the center of voxel is the origin

to_corner(self)

Safe function to shift streamlines so the corner of voxel is the origin

to_origin(self, target_origin)

Safe function to change streamlines to a particular origin standard. False means NIFTI (center) and True means TrackVis (corner)

to_rasmm(self)

Safe function to transform streamlines and update state

to_space(self, target_space)

Safe function to transform streamlines to a particular space using an enum and update state

to_vox(self)

Safe function to transform streamlines and update state

to_voxmm(self)

Safe function to transform streamlines and update state

__init__(self, streamlines, reference, space, origin=<Origin.NIFTI: 'center'>, data_per_point=None, data_per_streamline=None)

Create a strict, state-aware, robust tractogram

Parameters
streamlines : list or ArraySequence

Streamlines of the tractogram

reference : Nifti or Trk filename, Nifti1Image or TrkFile, Nifti1Header, trk.header (dict) or another StatefulTractogram

Reference that provides the spatial attributes. Typically a nifti-related object from the native diffusion used for streamlines generation

space : Enum (dipy.io.stateful_tractogram.Space)

Current space in which the streamlines are (vox, voxmm or rasmm). After tracking the space is VOX; after loading with nibabel the space is RASMM

origin : Enum (dipy.io.stateful_tractogram.Origin), optional

Current origin in which the streamlines are (center or corner). After loading with nibabel the origin is CENTER

data_per_point : dict, optional

Dictionary in which each key has X items, and each item has Y_i values, where X is the number of streamlines and Y_i is the number of points on streamline #i

data_per_streamline : dict, optional

Dictionary in which each key has X items, where X is the number of streamlines

Notes

It is very important to respect the convention: verify that the streamlines match the reference and are effectively in the right space.

Any change to the number of streamlines, data_per_point or data_per_streamline requires particular verification.

For manipulations not allowed by this object, use nibabel directly and be careful.

property affine

Getter for the reference affine

static are_compatible(sft_1, sft_2)

Compatibility verification of two StatefulTractogram objects to ensure space, origin, data_per_point and data_per_streamline consistency

compute_bounding_box(self)

Compute the bounding box of the streamlines in their current state

Returns
output : ndarray

8 corners of the XYZ aligned box, all zeros if no streamlines

property data_per_point

Getter for data_per_point

property data_per_streamline

Getter for data_per_streamline

property dimensions

Getter for the reference dimensions

static from_sft(streamlines, sft, data_per_point=None, data_per_streamline=None)

Create an instance of StatefulTractogram from another instance of StatefulTractogram.

Parameters
streamlines : list or ArraySequence

Streamlines of the tractogram

sft : StatefulTractogram

The other StatefulTractogram to copy the space_attributes AND state from.

data_per_point : dict, optional

Dictionary in which each key has X items, and each item has Y_i values, where X is the number of streamlines and Y_i is the number of points on streamline #i

data_per_streamline : dict, optional

Dictionary in which each key has X items, where X is the number of streamlines

get_data_per_point_keys(self)

Return a list of the data_per_point attribute names

get_data_per_streamline_keys(self)

Return a list of the data_per_streamline attribute names

get_streamlines_copy(self)

Safe getter for streamlines (for slicing)

is_bbox_in_vox_valid(self)

Verify that the bounding box is valid in voxel space. Negative coordinates or coordinates above the volume dimensions are considered invalid in voxel space.

Returns
output : bool

Are the streamlines within the volume of the associated reference

property origin

Getter for origin standard

remove_invalid_streamlines(self, epsilon=0.001)

Remove streamlines with invalid coordinates from the object. Also removes the corresponding data_per_point and data_per_streamline entries. Invalid coordinates are any X, Y, Z values above the reference dimensions or below zero.

Parameters
epsilon : float, optional

Epsilon value for the bounding box verification. Default is 0.001.

Returns
output : tuple

Tuple of two lists: indices_to_remove, indices_to_keep

property space

Getter for the current space

property space_attributes

Getter for spatial attribute

property streamlines

Partially safe getter for streamlines

to_center(self)

Safe function to shift streamlines so the center of voxel is the origin

to_corner(self)

Safe function to shift streamlines so the corner of voxel is the origin

to_origin(self, target_origin)

Safe function to change streamlines to a particular origin standard. False means NIFTI (center) and True means TrackVis (corner)

to_rasmm(self)

Safe function to transform streamlines and update state

to_space(self, target_space)

Safe function to transform streamlines to a particular space using an enum and update state

to_vox(self)

Safe function to transform streamlines and update state

to_voxmm(self)

Safe function to transform streamlines and update state

property voxel_order

Getter for the reference voxel order

property voxel_sizes

Getter for the reference voxel sizes

Tractogram

class dipy.io.streamline.Tractogram(streamlines=None, data_per_streamline=None, data_per_point=None, affine_to_rasmm=None)

Bases: object

Container for streamlines and their data information.

Streamlines of a tractogram can be in any coordinate system of your choice as long as you provide the correct affine_to_rasmm matrix at construction time. When applied to streamline coordinates, that transformation matrix should bring the streamlines back to world space (RAS+ and mm space) [1].

Moreover, when streamlines are mapped back to voxel space [2], a streamline point located at an integer coordinate (i,j,k) is considered to be at the center of the corresponding voxel. This is in contrast with other conventions where it might have referred to a corner.

References

[1] http://nipy.org/nibabel/coordinate_systems.html#naming-reference-spaces
[2] http://nipy.org/nibabel/coordinate_systems.html#voxel-coordinates-are-in-voxel-space

Attributes
streamlines : ArraySequence object

Sequence of \(T\) streamlines. Each streamline is an ndarray of shape (\(N_t\), 3) where \(N_t\) is the number of points of streamline \(t\).

data_per_streamline : PerArrayDict object

Dictionary where the items are (str, 2D array). Each key represents a piece of information \(i\) to be kept alongside every streamline, and its associated value is a 2D array of shape (\(T\), \(P_i\)) where \(T\) is the number of streamlines and \(P_i\) is the number of values to store for that particular piece of information \(i\).

data_per_point : PerArraySequenceDict object

Dictionary where the items are (str, ArraySequence). Each key represents a piece of information \(i\) to be kept alongside every point of every streamline, and its associated value is an iterable of ndarrays of shape (\(N_t\), \(M_i\)) where \(N_t\) is the number of points for a particular streamline \(t\) and \(M_i\) is the number of values to store for that particular piece of information \(i\).

Methods

apply_affine(self, affine[, lazy])

Applies an affine transformation on the points of each streamline.

copy(self)

Returns a copy of this Tractogram object.

extend(self, other)

Appends the data of another Tractogram.

to_world(self[, lazy])

Brings the streamlines to world space (i.e.

__init__(self, streamlines=None, data_per_streamline=None, data_per_point=None, affine_to_rasmm=None)
Parameters
streamlines : iterable of ndarrays or ArraySequence, optional

Sequence of \(T\) streamlines. Each streamline is an ndarray of shape (\(N_t\), 3) where \(N_t\) is the number of points of streamline \(t\).

data_per_streamline : dict of iterable of ndarrays, optional

Dictionary where the items are (str, iterable). Each key represents a piece of information \(i\) to be kept alongside every streamline, and its associated value is an iterable of ndarrays of shape (\(P_i\),) where \(P_i\) is the number of scalar values to store for that particular piece of information \(i\).

data_per_point : dict of iterable of ndarrays, optional

Dictionary where the items are (str, iterable). Each key represents a piece of information \(i\) to be kept alongside every point of every streamline, and its associated value is an iterable of ndarrays of shape (\(N_t\), \(M_i\)) where \(N_t\) is the number of points for a particular streamline \(t\) and \(M_i\) is the number of scalar values to store for that particular piece of information \(i\).

affine_to_rasmm : ndarray of shape (4, 4) or None, optional

Transformation matrix that brings the streamlines contained in this tractogram to RAS+ and mm space where coordinate (0,0,0) refers to the center of the voxel. By default, the streamlines are in an unknown space, i.e. affine_to_rasmm is None.

property affine_to_rasmm

Affine bringing streamlines in this tractogram to RAS+mm.

apply_affine(self, affine, lazy=False)

Applies an affine transformation on the points of each streamline.

If lazy is not specified, this is performed in-place.

Parameters
affine : ndarray of shape (4, 4)

Transformation that will be applied to every streamline.

lazy : {False, True}, optional

If True, streamlines are not transformed in-place and a LazyTractogram object is returned. Otherwise, streamlines are modified in-place.

Returns
tractogram : Tractogram or LazyTractogram object

Tractogram where the streamlines have been transformed according to the given affine transformation. If the lazy option is true, it returns a LazyTractogram object, otherwise it returns a reference to this Tractogram object with updated streamlines.

copy(self)

Returns a copy of this Tractogram object.

property data_per_point
property data_per_streamline
extend(self, other)

Appends the data of another Tractogram.

Data that will be appended includes the streamlines and the content of both dictionaries data_per_streamline and data_per_point.

Parameters
other : Tractogram object

Its data will be appended to the data of this tractogram.

Returns
None

Notes

The entries in both dictionaries self.data_per_streamline and self.data_per_point must match respectively those contained in the other tractogram.

property streamlines
to_world(self, lazy=False)

Brings the streamlines to world space (i.e. RAS+ and mm).

If lazy is not specified, this is performed in-place.

Parameters
lazy : {False, True}, optional

If True, streamlines are not transformed in-place and a LazyTractogram object is returned. Otherwise, streamlines are modified in-place.

Returns
tractogram : Tractogram or LazyTractogram object

Tractogram where the streamlines have been sent to world space. If the lazy option is true, it returns a LazyTractogram object, otherwise it returns a reference to this Tractogram object with updated streamlines.

create_tractogram_header

dipy.io.streamline.create_tractogram_header(tractogram_type, affine, dimensions, voxel_sizes, voxel_order)

Write a standard trk/tck header from spatial attributes

deepcopy

dipy.io.streamline.deepcopy(x, memo=None, _nil=[])

Deep copy operation on arbitrary Python objects.

See the module’s __doc__ string for more info.

detect_format

dipy.io.streamline.detect_format(fileobj)

Returns the StreamlinesFile object guessed from the file-like object.

Parameters
fileobj : string or file-like object

If string, a filename; otherwise an open file-like object pointing to a tractogram file (and ready to read from the beginning of the header)

Returns
tractogram_file : TractogramFile class

The class type guessed from the content of fileobj.

is_header_compatible

dipy.io.streamline.is_header_compatible(reference_1, reference_2)

Will compare the spatial attributes of 2 references

Parameters
reference_1 : Nifti or Trk filename, Nifti1Image or TrkFile, Nifti1Header or trk.header (dict)

Reference that provides the spatial attributes.

reference_2 : Nifti or Trk filename, Nifti1Image or TrkFile, Nifti1Header or trk.header (dict)

Reference that provides the spatial attributes.

Returns
output : bool

Whether all the spatial attributes match

load_dpy

dipy.io.streamline.load_dpy(filename, reference, to_space=<Space.RASMM: 'rasmm'>, to_origin=<Origin.NIFTI: 'center'>, bbox_valid_check=True, trk_header_check=True)

Load the stateful tractogram of the .dpy format

Parameters
filename : string

Filename with valid extension

reference : Nifti or Trk filename, Nifti1Image or TrkFile, Nifti1Header, trk.header (dict), or 'same' if the input is a trk file

Reference that provides the spatial attributes. Typically a nifti-related object from the native diffusion used for streamlines generation

to_space : Enum (dipy.io.stateful_tractogram.Space)

Space to which the streamlines will be transformed after loading

to_origin : Enum (dipy.io.stateful_tractogram.Origin)

Origin to which the streamlines will be transformed after loading: NIFTI standard, default (center of the voxel); TRACKVIS standard (corner of the voxel)

bbox_valid_check : bool

Verification for negative voxel coordinates or values above the volume dimensions. Default is True, to enforce a valid file.

trk_header_check : bool

Verification that the reference has the same spatial attributes as the header of the input tractogram when a trk is loaded

Returns
output : StatefulTractogram

The loaded tractogram (must have been saved properly)

load_fib

dipy.io.streamline.load_fib(filename, reference, to_space=<Space.RASMM: 'rasmm'>, to_origin=<Origin.NIFTI: 'center'>, bbox_valid_check=True, trk_header_check=True)

Load a stateful tractogram from the .fib format

Parameters
filename : string

Filename with valid extension

reference : Nifti or Trk filename, Nifti1Image or TrkFile, Nifti1Header or trk.header (dict), or 'same' if the input is a trk file

Reference that provides the spatial attributes. Typically a nifti-related object from the native diffusion acquisition used for streamline generation

to_space : Enum (dipy.io.stateful_tractogram.Space)

Space to which the streamlines will be transformed after loading

to_origin : Enum (dipy.io.stateful_tractogram.Origin)

Origin to which the streamlines will be transformed after loading: NIFTI standard, default (center of the voxel); TRACKVIS standard (corner of the voxel)

bbox_valid_check : bool

Verification for negative voxel coordinates or values above the volume dimensions. Default is True, to enforce a valid file.

trk_header_check : bool

Verification that the reference has the same spatial attributes (header) as the input tractogram when a Trk file is loaded

Returns
output : StatefulTractogram

The loaded tractogram (must have been saved properly)

load_generator

dipy.io.streamline.load_generator(ttype)

Generate a loading function that performs a file extension check to restrict the user to a single file format.

Parameters
ttype : string

Extension of the file format that requires a loader

Returns
output : function

Function (load_tractogram) that handles only one file format
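The factory pattern described above can be illustrated with a small closure that rejects filenames whose extension does not match. This is a hedged sketch of the idea only; the real generator delegates to load_tractogram rather than returning a placeholder string:

```python
import os

def load_generator(ttype):
    """Sketch of an extension-restricted loader factory: wrap a generic
    loader so it only accepts files ending in `ttype`."""
    def loader(filename, *args, **kwargs):
        _, ext = os.path.splitext(filename)
        if ext != ttype:
            raise ValueError(
                f"{filename} does not have the expected {ttype} extension")
        # A real implementation would delegate to load_tractogram here;
        # this placeholder just reports that the file was accepted.
        return f"would load {filename}"
    return loader

load_trk_only = load_generator(".trk")
print(load_trk_only("bundle.trk"))  # would load bundle.trk
```

This is how format-specific helpers such as load_trk and load_tck can share one implementation while still validating extensions.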

load_tck

dipy.io.streamline.load_tck(filename, reference, to_space=<Space.RASMM: 'rasmm'>, to_origin=<Origin.NIFTI: 'center'>, bbox_valid_check=True, trk_header_check=True)

Load a stateful tractogram from the .tck format

Parameters
filename : string

Filename with valid extension

reference : Nifti or Trk filename, Nifti1Image or TrkFile, Nifti1Header or trk.header (dict), or 'same' if the input is a trk file

Reference that provides the spatial attributes. Typically a nifti-related object from the native diffusion acquisition used for streamline generation

to_space : Enum (dipy.io.stateful_tractogram.Space)

Space to which the streamlines will be transformed after loading

to_origin : Enum (dipy.io.stateful_tractogram.Origin)

Origin to which the streamlines will be transformed after loading: NIFTI standard, default (center of the voxel); TRACKVIS standard (corner of the voxel)

bbox_valid_check : bool

Verification for negative voxel coordinates or values above the volume dimensions. Default is True, to enforce a valid file.

trk_header_check : bool

Verification that the reference has the same spatial attributes (header) as the input tractogram when a Trk file is loaded

Returns
output : StatefulTractogram

The loaded tractogram (must have been saved properly)

load_tractogram

dipy.io.streamline.load_tractogram(filename, reference, to_space=<Space.RASMM: 'rasmm'>, to_origin=<Origin.NIFTI: 'center'>, bbox_valid_check=True, trk_header_check=True)

Load a stateful tractogram from any supported format (trk, tck, vtk, fib, dpy)

Parameters
filename : string

Filename with valid extension

reference : Nifti or Trk filename, Nifti1Image or TrkFile, Nifti1Header or trk.header (dict), or 'same' if the input is a trk file

Reference that provides the spatial attributes. Typically a nifti-related object from the native diffusion acquisition used for streamline generation

to_space : Enum (dipy.io.stateful_tractogram.Space)

Space to which the streamlines will be transformed after loading

to_origin : Enum (dipy.io.stateful_tractogram.Origin)

Origin to which the streamlines will be transformed after loading: NIFTI standard, default (center of the voxel); TRACKVIS standard (corner of the voxel)

bbox_valid_check : bool

Verification for negative voxel coordinates or values above the volume dimensions. Default is True, to enforce a valid file.

trk_header_check : bool

Verification that the reference has the same spatial attributes (header) as the input tractogram when a Trk file is loaded

Returns
output : StatefulTractogram

The loaded tractogram (must have been saved properly)

load_trk

dipy.io.streamline.load_trk(filename, reference, to_space=<Space.RASMM: 'rasmm'>, to_origin=<Origin.NIFTI: 'center'>, bbox_valid_check=True, trk_header_check=True)

Load a stateful tractogram from the .trk format

Parameters
filename : string

Filename with valid extension

reference : Nifti or Trk filename, Nifti1Image or TrkFile, Nifti1Header or trk.header (dict), or 'same' if the input is a trk file

Reference that provides the spatial attributes. Typically a nifti-related object from the native diffusion acquisition used for streamline generation

to_space : Enum (dipy.io.stateful_tractogram.Space)

Space to which the streamlines will be transformed after loading

to_origin : Enum (dipy.io.stateful_tractogram.Origin)

Origin to which the streamlines will be transformed after loading: NIFTI standard, default (center of the voxel); TRACKVIS standard (corner of the voxel)

bbox_valid_check : bool

Verification for negative voxel coordinates or values above the volume dimensions. Default is True, to enforce a valid file.

trk_header_check : bool

Verification that the reference has the same spatial attributes (header) as the input tractogram when a Trk file is loaded

Returns
output : StatefulTractogram

The loaded tractogram (must have been saved properly)

load_vtk

dipy.io.streamline.load_vtk(filename, reference, to_space=<Space.RASMM: 'rasmm'>, to_origin=<Origin.NIFTI: 'center'>, bbox_valid_check=True, trk_header_check=True)

Load a stateful tractogram from the .vtk format

Parameters
filename : string

Filename with valid extension

reference : Nifti or Trk filename, Nifti1Image or TrkFile, Nifti1Header or trk.header (dict), or 'same' if the input is a trk file

Reference that provides the spatial attributes. Typically a nifti-related object from the native diffusion acquisition used for streamline generation

to_space : Enum (dipy.io.stateful_tractogram.Space)

Space to which the streamlines will be transformed after loading

to_origin : Enum (dipy.io.stateful_tractogram.Origin)

Origin to which the streamlines will be transformed after loading: NIFTI standard, default (center of the voxel); TRACKVIS standard (corner of the voxel)

bbox_valid_check : bool

Verification for negative voxel coordinates or values above the volume dimensions. Default is True, to enforce a valid file.

trk_header_check : bool

Verification that the reference has the same spatial attributes (header) as the input tractogram when a Trk file is loaded

Returns
output : StatefulTractogram

The loaded tractogram (must have been saved properly)

load_vtk_streamlines

dipy.io.streamline.load_vtk_streamlines(filename, to_lps=True)

Load streamlines from vtk polydata.

Supported formats are VTK and FIB.

Parameters
filename : string

input filename (.vtk or .fib)

to_lps : bool

Defaults to True; follows the vtk file convention for streamlines, which is supported by MITK Diffusion and MI-Brain

Returns
output : list

list of 2D arrays
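The to_lps flag concerns the anatomical coordinate convention: LPS (Left-Posterior-Superior) differs from RAS by a sign flip of the first two axes. A minimal numpy sketch of that flip (an assumption about the convention involved, not DIPY's exact code path):

```python
import numpy as np

def ras_to_lps(streamlines):
    """Sketch: convert a list of (N, 3) RAS point arrays to LPS by
    negating the x and y coordinates; z (superior) is unchanged."""
    flip = np.array([-1.0, -1.0, 1.0])
    return [np.asarray(s) * flip for s in streamlines]

points = [np.array([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]])]
lps = ras_to_lps(points)  # first point becomes (-1, -2, 3)
```

Applying the same flip twice recovers the original coordinates, which is why a symmetric flag works for both loading and saving.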

save_dpy

dipy.io.streamline.save_dpy(sft, filename, bbox_valid_check=True)

Save a stateful tractogram to the .dpy format

Parameters
sft : StatefulTractogram

The stateful tractogram to save

filename : string

Filename with valid extension

bbox_valid_check : bool

Verification for negative voxel coordinates or values above the volume dimensions. Default is True, to enforce a valid file.

Returns
output : bool

True if the saving operation was successful

save_fib

dipy.io.streamline.save_fib(sft, filename, bbox_valid_check=True)

Save a stateful tractogram to the .fib format

Parameters
sft : StatefulTractogram

The stateful tractogram to save

filename : string

Filename with valid extension

bbox_valid_check : bool

Verification for negative voxel coordinates or values above the volume dimensions. Default is True, to enforce a valid file.

Returns
output : bool

True if the saving operation was successful

save_generator

dipy.io.streamline.save_generator(ttype)

Generate a saving function that performs a file extension check to restrict the user to a single file format.

Parameters
ttype : string

Extension of the file format that requires a saver

Returns
output : function

Function (save_tractogram) that handles only one file format

save_tck

dipy.io.streamline.save_tck(sft, filename, bbox_valid_check=True)

Save a stateful tractogram to the .tck format

Parameters
sft : StatefulTractogram

The stateful tractogram to save

filename : string

Filename with valid extension

bbox_valid_check : bool

Verification for negative voxel coordinates or values above the volume dimensions. Default is True, to enforce a valid file.

Returns
output : bool

True if the saving operation was successful

save_tractogram

dipy.io.streamline.save_tractogram(sft, filename, bbox_valid_check=True)

Save a stateful tractogram in any supported format (trk, tck, vtk, fib, dpy)

Parameters
sft : StatefulTractogram

The stateful tractogram to save

filename : string

Filename with valid extension

bbox_valid_check : bool

Verification for negative voxel coordinates or values above the volume dimensions. Default is True, to enforce a valid file.

Returns
output : bool

True if the saving operation was successful

save_trk

dipy.io.streamline.save_trk(sft, filename, bbox_valid_check=True)

Save a stateful tractogram to the .trk format

Parameters
sft : StatefulTractogram

The stateful tractogram to save

filename : string

Filename with valid extension

bbox_valid_check : bool

Verification for negative voxel coordinates or values above the volume dimensions. Default is True, to enforce a valid file.

Returns
output : bool

True if the saving operation was successful

save_vtk

dipy.io.streamline.save_vtk(sft, filename, bbox_valid_check=True)

Save a stateful tractogram to the .vtk format

Parameters
sft : StatefulTractogram

The stateful tractogram to save

filename : string

Filename with valid extension

bbox_valid_check : bool

Verification for negative voxel coordinates or values above the volume dimensions. Default is True, to enforce a valid file.

Returns
output : bool

True if the saving operation was successful

save_vtk_streamlines

dipy.io.streamline.save_vtk_streamlines(streamlines, filename, to_lps=True, binary=False)

Save streamlines as vtk polydata to a supported format file.

Supported formats are OBJ, VTK, FIB, PLY, STL and XML.

Parameters
streamlines : list

list of 2D arrays or ArraySequence

filename : string

output filename (.obj, .vtk, .fib, .ply, .stl or .xml)

to_lps : bool

Defaults to True; follows the vtk file convention for streamlines, which is supported by MITK Diffusion and MI-Brain

binary : bool

save the file as binary

Nifti1Image

class dipy.io.utils.Nifti1Image(dataobj, affine, header=None, extra=None, file_map=None)

Bases: nibabel.nifti1.Nifti1Pair, nibabel.filebasedimages.SerializableImage

Class for single file NIfTI1 format image

Attributes
affine
dataobj
header
in_memory

True when any array data is in memory cache

ndim
shape
slicer

Slicer object that returns cropped and subsampled images

Methods

ImageArrayProxy

alias of nibabel.arrayproxy.ArrayProxy

ImageSlicer

alias of nibabel.spatialimages.SpatialFirstSlicer

as_reoriented(self, ornt)

Apply an orientation change and return a new image

filespec_to_file_map(filespec)

Make file_map for this class from filename filespec

filespec_to_files(filespec)

filespec_to_files class method is deprecated.

from_bytes(bytestring)

Construct image from a byte string

from_file_map(file_map[, mmap, keep_file_open])

Class method to create image from mapping in file_map

from_filename(filename[, mmap, keep_file_open])

Class method to create image from filename filename

from_files(file_map)

from_files class method is deprecated.

from_image(img)

Class method to create new instance of own class from img

get_affine(self)

Get affine from image

get_data(self[, caching])

Return image data from image with any necessary scaling applied

get_fdata(self[, caching, dtype])

Return floating point image data with necessary scaling applied

get_filename(self)

Fetch the image filename

get_header(self)

Get header from image

get_qform(self[, coded])

Return 4x4 affine matrix from qform parameters in header

get_sform(self[, coded])

Return 4x4 affine matrix from sform parameters in header

get_shape(self)

Return shape for image

header_class

alias of Nifti1Header

instance_to_filename(img, filename)

Save img in our own format, to name implied by filename

load(filename[, mmap, keep_file_open])

Class method to create image from filename filename

make_file_map([mapping])

Class method to make files holder for this image type

orthoview(self)

Plot the image using OrthoSlicer3D

path_maybe_image(filename[, sniff, sniff_max])

Return True if filename may be image matching this class

set_filename(self, filename)

Sets the files in the object from a given filename

set_qform(self, affine[, code, strip_shears])

Set qform header values from 4x4 affine

set_sform(self, affine[, code])

Set sform transform from 4x4 affine

to_bytes(self)

Return a bytes object with the contents of the file that would be written if the image were saved.

to_file_map(self[, file_map])

Write image to file_map or contained self.file_map

to_filename(self, filename)

Write image to files implied by filename string

to_files(self[, file_map])

to_files method is deprecated.

to_filespec(self, filename)

to_filespec method is deprecated.

uncache(self)

Delete any cached read of data from proxied data

update_header(self)

Harmonize header with image data and affine

get_data_dtype

set_data_dtype

__init__(self, dataobj, affine, header=None, extra=None, file_map=None)

Initialize image

The image is a combination of (array-like, affine matrix, header), with optional metadata in extra, and filename / file-like objects contained in the file_map mapping.

Parameters
dataobj : object

Object containing the image data. It should be some object that returns an array from np.asanyarray. It should have a shape attribute or property

affine : None or (4,4) array-like

homogeneous affine giving the relationship between voxel coordinates and world coordinates. Affine can also be None. In this case, obj.affine also returns None, and the affine as written to disk will depend on the file format.

header : None or mapping or header instance, optional

metadata for this image format

extra : None or mapping, optional

metadata to associate with the image that cannot be stored in the metadata of this image type

file_map : mapping, optional

mapping giving file information for this image format

Notes

If both a header and an affine are specified, and the affine does not match the affine that is in the header, the affine will be used, but the sform_code and qform_code fields in the header will be re-initialised to their default values. This is performed on the basis that, if you are changing the affine, you are likely to be changing the space to which the affine is pointing. The set_sform() and set_qform() methods can be used to update the codes after an image has been created - see those methods, and the manual for more details.

files_types = (('image', '.nii'),)
header_class

alias of Nifti1Header

update_header(self)

Harmonize header with image data and affine

valid_exts = ('.nii',)

create_nifti_header

dipy.io.utils.create_nifti_header(affine, dimensions, voxel_sizes)

Write a standard nifti header from spatial attributes

create_tractogram_header

dipy.io.utils.create_tractogram_header(tractogram_type, affine, dimensions, voxel_sizes, voxel_order)

Write a standard trk/tck header from spatial attributes

decfa

dipy.io.utils.decfa(img_orig, scale=False)

Create a nifti-compliant directional-encoded color FA image.

Parameters
img_orig : Nifti1Image class instance

Contains the encoding of the DEC FA image as a 4D volume of data, where the elements on the last dimension represent the R, G and B components.

scale : bool

Whether to scale the incoming data from the 0-1 range to the 0-255 range expected in the output.

Returns
img : Nifti1Image class instance with dtype set to store tuples of uint8 in (R, G, B) order

Notes

For a description of this format, see:

https://nifti.nimh.nih.gov/nifti-1/documentation/nifti1fields/nifti1fields_pages/datatype.html
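The scale=True step maps float RGB values in [0, 1] to the uint8 [0, 255] range required by the DEC FA datatype. A rough numpy sketch of that scaling (the exact rounding used by DIPY is an assumption here):

```python
import numpy as np

def scale_decfa_data(data):
    """Sketch: map float RGB values in [0, 1] to uint8 [0, 255],
    clipping out-of-range values before the cast."""
    return np.clip(np.round(data * 255), 0, 255).astype(np.uint8)

rgb = np.array([[0.0, 0.5, 1.0]])
print(scale_decfa_data(rgb))  # [[  0 128 255]]
```

Clipping before the cast avoids the wrap-around that a plain uint8 cast would produce for values slightly outside [0, 1].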

decfa_to_float

dipy.io.utils.decfa_to_float(img_orig)

Convert a nifti-compliant directional-encoded color FA image into a nifti image with RGB encoded in floating point resolution.

Parameters
img_orig : Nifti1Image class instance

Contains the encoding of the DEC FA image as a 3D volume of data, where each element is an (R, G, B) tuple in uint8.

Returns
img : Nifti1Image class instance with float dtype

Notes

For a description of this format, see:

https://nifti.nimh.nih.gov/nifti-1/documentation/nifti1fields/nifti1fields_pages/datatype.html

detect_format

dipy.io.utils.detect_format(fileobj)

Returns the StreamlinesFile object guessed from the file-like object.

Parameters
fileobj : string or file-like object

If string, a filename; otherwise an open file-like object pointing to a tractogram file (and ready to read from the beginning of the header)

Returns
tractogram_file : TractogramFile class

The class type guessed from the content of fileobj.

get_reference_info

dipy.io.utils.get_reference_info(reference)

Extract the spatial attributes of a reference

Parameters
reference : Nifti or Trk filename, Nifti1Image or TrkFile, Nifti1Header or trk.header (dict)

Reference that provides the spatial attributes.

Returns
output : tuple
  • affine, ndarray (4,4), np.float32, transformation of VOX to RASMM

  • dimensions, ndarray (3,), int16, volume shape for each axis

  • voxel_sizes, ndarray (3,), float32, size of voxel for each axis

  • voxel_order, string, typically 'RAS' or 'LPS'
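These attributes can all be derived from an affine and a volume shape. A simplified numpy sketch (not DIPY's implementation; the voxel_order derivation below assumes an axis-aligned affine, whereas the real code handles arbitrary orientations via nibabel):

```python
import numpy as np

def spatial_attributes(affine, shape):
    """Sketch: derive (affine, dimensions, voxel_sizes, voxel_order)
    from a 4x4 affine and a volume shape."""
    affine = np.asarray(affine, dtype=np.float32)
    dimensions = np.asarray(shape[:3], dtype=np.int16)
    # Voxel sizes are the column norms of the rotation/zoom block.
    voxel_sizes = np.linalg.norm(affine[:3, :3], axis=0).astype(np.float32)
    # For an axis-aligned affine, the sign of each diagonal entry picks
    # R/L, A/P, S/I (a simplification of nibabel's aff2axcodes).
    codes = [("R", "L"), ("A", "P"), ("S", "I")]
    voxel_order = "".join(pos if affine[i, i] >= 0 else neg
                          for i, (pos, neg) in enumerate(codes))
    return affine, dimensions, voxel_sizes, voxel_order

aff = np.diag([2.0, 2.0, 2.0, 1.0])
_, dims, vox, order = spatial_attributes(aff, (90, 108, 90))
# dims -> int16 [90 108 90], vox -> 2 mm isotropic, order -> "RAS"
```

The column-norm trick for voxel sizes works even for rotated affines, which is why it is preferred over reading the diagonal.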

is_header_compatible

dipy.io.utils.is_header_compatible(reference_1, reference_2)

Compare the spatial attributes of two references

Parameters
reference_1 : Nifti or Trk filename, Nifti1Image or TrkFile, Nifti1Header or trk.header (dict)

Reference that provides the spatial attributes.

reference_2 : Nifti or Trk filename, Nifti1Image or TrkFile, Nifti1Header or trk.header (dict)

Reference that provides the spatial attributes.

Returns
output : bool

True if all spatial attributes match

is_reference_info_valid

dipy.io.utils.is_reference_info_valid(affine, dimensions, voxel_sizes, voxel_order)

Validate basic data type and value of spatial attribute.

Does not ensure that voxel_sizes and voxel_order are self-coherent with the affine. Only verifies the following:

  • affine is of the right type (float) and dimension (4,4)

  • affine contains values in the rotation part

  • dimensions is of the right type (int) and length (3)

  • voxel_sizes is of the right type (float) and length (3)

  • voxel_order is of the right type (str) and length (3)

The listed parameters are what is expected; provide something else and this function should fail (covers common mistakes).

Parameters
affine : ndarray (4,4)

Transformation of VOX to RASMM

dimensions : ndarray (3,), int16

Volume shape for each axis

voxel_sizes : ndarray (3,), float32

Size of voxel for each axis

voxel_order : string

Typically 'RAS' or 'LPS'

Returns
output : bool

Whether the input represents a valid set of spatial attributes
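The checks listed above translate into a short chain of type and length tests. A hedged sketch of that logic (illustrative only; DIPY's actual implementation may differ in detail):

```python
import numpy as np

def reference_info_valid(affine, dimensions, voxel_sizes, voxel_order):
    """Sketch: validate basic type, shape and content of the four
    spatial attributes, mirroring the documented checks."""
    affine = np.asarray(affine)
    if affine.shape != (4, 4) or not np.issubdtype(affine.dtype, np.floating):
        return False
    if not np.any(affine[:3, :3]):  # rotation part must contain values
        return False
    if len(dimensions) != 3 or not all(int(d) == d for d in dimensions):
        return False
    if len(voxel_sizes) != 3:
        return False
    if not isinstance(voxel_order, str) or len(voxel_order) != 3:
        return False
    return True

print(reference_info_valid(np.eye(4), (10, 10, 10), (1.0, 1.0, 1.0), "RAS"))        # True
print(reference_info_valid(np.zeros((4, 4)), (10, 10, 10), (1.0, 1.0, 1.0), "RAS"))  # False
```

An all-zero rotation block is rejected because no meaningful VOX-to-RASMM mapping can have one.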

make5d

dipy.io.utils.make5d(input)

Reshape the input to have 5 dimensions, adding extra size-1 dimensions just before the last dimension
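The documented behavior can be sketched in a few lines of numpy; size-1 axes are inserted before the last dimension until the array is 5-D (a sketch of the stated contract, not DIPY's exact code):

```python
import numpy as np

def make5d(data):
    """Sketch: pad the shape with size-1 axes just before the last
    dimension until the array has exactly 5 dimensions."""
    data = np.asarray(data)
    if data.ndim > 5:
        raise ValueError("input already has more than 5 dimensions")
    shape = data.shape
    new_shape = shape[:-1] + (1,) * (5 - data.ndim) + shape[-1:]
    return data.reshape(new_shape)

print(make5d(np.zeros((3, 4, 2))).shape)  # (3, 4, 1, 1, 2)
```

Keeping the last axis last matters because NIfTI conventions place per-voxel vector components (e.g. symmetric matrix elements) on the final dimension.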

nifti1_symmat

dipy.io.utils.nifti1_symmat(image_data, *args, **kwargs)

Returns a Nifti1Image with a symmetric matrix intent

Parameters
image_data : array-like

should have the lower triangular elements of a symmetric matrix along the last dimension

All other arguments and keywords are passed to Nifti1Image.

Returns
image : Nifti1Image

5d, with extra dimensions added before the last. Has symmetric matrix intent code

optional_package

dipy.io.utils.optional_package(name, trip_msg=None)

Return package-like thing and module setup for package name

Parameters
name : str

package name

trip_msg : None or str

message to give when someone tries to use the returned package, but we could not import it, and have returned a TripWire object instead. Default message if None.

Returns
pkg_like : module or TripWire instance

If we can import the package, return it. Otherwise return an object that raises an error when accessed

have_pkg : bool

True if the import of the package was successful, False otherwise

module_setup : function

callable usually set as setup_module in the calling namespace, to allow skipping tests.

Examples

Typical use would be something like this at the top of a module using an optional package:

>>> from dipy.utils.optpkg import optional_package
>>> pkg, have_pkg, setup_module = optional_package('not_a_package')

Of course in this case the package doesn’t exist, and so, in the module:

>>> have_pkg
False

and

>>> pkg.some_function() 
Traceback (most recent call last):
    ...
TripWireError: We need package not_a_package for these functions, but
``import not_a_package`` raised an ImportError

If the module does exist - we get the module

>>> pkg, _, _ = optional_package('os')
>>> hasattr(pkg, 'path')
True

Or a submodule if that’s what we asked for

>>> subpkg, _, _ = optional_package('os.path')
>>> hasattr(subpkg, 'dirname')
True

save_buan_profiles_hdf5

dipy.io.utils.save_buan_profiles_hdf5(fname, dt)

Save the given input DataFrame to a .h5 file

Parameters
fname : string

file name for saving the hdf5 file

dt : pandas DataFrame

DataFrame to be saved as a .h5 file

load_polydata

dipy.io.vtk.load_polydata(file_name)

Load vtk polydata from a supported file format.

Supported file formats are OBJ, VTK, FIB, PLY, STL and XML

Parameters
file_name : string
Returns
output : vtkPolyData

load_vtk_streamlines

dipy.io.vtk.load_vtk_streamlines(filename, to_lps=True)

Load streamlines from vtk polydata.

Supported formats are VTK and FIB.

Parameters
filename : string

input filename (.vtk or .fib)

to_lps : bool

Defaults to True; follows the vtk file convention for streamlines, which is supported by MITK Diffusion and MI-Brain

Returns
output : list

list of 2D arrays

optional_package

dipy.io.vtk.optional_package(name, trip_msg=None)

Return package-like thing and module setup for package name

Parameters
name : str

package name

trip_msg : None or str

message to give when someone tries to use the returned package, but we could not import it, and have returned a TripWire object instead. Default message if None.

Returns
pkg_like : module or TripWire instance

If we can import the package, return it. Otherwise return an object that raises an error when accessed

have_pkg : bool

True if the import of the package was successful, False otherwise

module_setup : function

callable usually set as setup_module in the calling namespace, to allow skipping tests.

Examples

Typical use would be something like this at the top of a module using an optional package:

>>> from dipy.utils.optpkg import optional_package
>>> pkg, have_pkg, setup_module = optional_package('not_a_package')

Of course in this case the package doesn’t exist, and so, in the module:

>>> have_pkg
False

and

>>> pkg.some_function() 
Traceback (most recent call last):
    ...
TripWireError: We need package not_a_package for these functions, but
``import not_a_package`` raised an ImportError

If the module does exist - we get the module

>>> pkg, _, _ = optional_package('os')
>>> hasattr(pkg, 'path')
True

Or a submodule if that’s what we asked for

>>> subpkg, _, _ = optional_package('os.path')
>>> hasattr(subpkg, 'dirname')
True

save_polydata

dipy.io.vtk.save_polydata(polydata, file_name, binary=False, color_array_name=None)

Save vtk polydata to a supported file format.

Supported save formats are VTK, FIB, PLY, STL and XML.

Parameters
polydata : vtkPolyData
file_name : string

save_vtk_streamlines

dipy.io.vtk.save_vtk_streamlines(streamlines, filename, to_lps=True, binary=False)

Save streamlines as vtk polydata to a supported format file.

Supported formats are OBJ, VTK, FIB, PLY, STL and XML.

Parameters
streamlines : list

list of 2D arrays or ArraySequence

filename : string

output filename (.obj, .vtk, .fib, .ply, .stl or .xml)

to_lps : bool

Defaults to True; follows the vtk file convention for streamlines, which is supported by MITK Diffusion and MI-Brain

binary : bool

save the file as binary

setup_module

dipy.io.vtk.setup_module()

transform_streamlines

dipy.io.vtk.transform_streamlines(streamlines, mat, in_place=False)

Apply an affine transformation to streamlines

Parameters
streamlines : Streamlines

Streamlines object

mat : array, (4, 4)

transformation matrix

in_place : bool

If True, change the data in place. Be careful: this modifies the input streamlines.

Returns
new_streamlines : Streamlines

Sequence of transformed 2D ndarrays with shape[-1] == 3
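Applying a (4, 4) affine to each (N, 3) streamline means rotating/scaling with the upper-left 3x3 block and then adding the translation column. A minimal numpy sketch of that operation (the real function operates on nibabel ArraySequence objects; plain lists of arrays are used here for illustration):

```python
import numpy as np

def transform_streamlines(streamlines, mat, in_place=False):
    """Sketch: apply a 4x4 affine to each (N, 3) streamline."""
    rotation, translation = mat[:3, :3], mat[:3, 3]
    transformed = [np.asarray(s) @ rotation.T + translation
                   for s in streamlines]
    if in_place:
        # Replace the contents of the caller's list (mutates the input).
        streamlines[:] = transformed
        return streamlines
    return transformed

lines = [np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])]
shift = np.eye(4)
shift[:3, 3] = [10.0, 0.0, 0.0]
moved = transform_streamlines(lines, shift)
# each point is shifted by 10 along x; `lines` itself is untouched
```

Multiplying by the transpose of the rotation block lets the whole (N, 3) point array be transformed in one matrix product instead of a per-point loop.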