io
io.bvectxt
io.dpy
io.gradients
io.image
io.peaks
io.pickles
io.stateful_tractogram
io.streamline
io.utils
io.vtk
Dpy
Dpy
Streamlines
InTemporaryDirectory
PeaksAndMetrics
Sphere
Origin
PerArrayDict
PerArraySequenceDict
Space
StatefulTractogram
Streamlines
Tractogram
product
Dpy
Origin
Space
StatefulTractogram
Tractogram
Nifti1Image
io
Load object from pickle file fname.
Returns an array representation of an ornt string.
Returns a string representation of a 3d ornt.
Calculates the mapping needed to get from orn1 to orn2.
Read b-values and b-vectors from disk.
Read gradient table information from a pair of files with extensions .bvec and .bval.
Changes the orientation of gradients or other vectors.
Save dix to fname as pickle.
io.bvectxt
Returns an array representation of an ornt string.
Returns a string representation of a 3d ornt.
Calculates the mapping needed to get from orn1 to orn2.
Read gradient table information from a pair of files with extensions .bvec and .bval.
Changes the orientation of gradients or other vectors.
Split the extension from a pathname.
io.dpy
A class for handling large tractography datasets.
It is built using h5py, which in turn implements key features of the HDF5 (hierarchical data format) API [1].
io.gradients
Create, return, and change directory to a temporary directory.
Read b-values and b-vectors from disk.
Split the extension from a pathname.
io.image
Load data and other information from a nifti file.
Load only the data array from a nifti file.
Save a data array into a nifti file.
Save Quality Assurance metrics.
io.peaks
Points on the unit sphere.
Load a PeaksAndMetrics HDF5 file (PAM5).
Save SH, directions, indices and values of peaks to Nifti.
Reshape peaks for visualization.
Save a data array into a nifti file.
Save all important attributes of object PeaksAndMetrics in a PAM5 file (HDF5).
io.pickles
Load and save pickles
Load object from pickle file fname.
Save dix to fname as pickle.
io.stateful_tractogram
Enum to simplify future change to convention.
Dictionary for which key access can do slicing on the values.
Dictionary for which key access can do slicing on the values.
Enum to simplify future change to convention.
Class for stateful representation of collections of streamlines. Object designed to be identical no matter the file format (trk, tck, vtk, fib, dpy).
Container for streamlines and their data information.
product(*iterables, repeat=1) --> product object
Apply affine matrix aff to points pts.
bisect_right(a, x[, lo[, hi]]) -> index
Deep copy operation on arbitrary Python objects.
Will compare the spatial attribute of 2 references.
Will compare the spatial attribute of 2 references.
Validate basic data type and value of spatial attribute.
Change the logger of the StatefulTractogram to one of the following: DEBUG, INFO, WARNING, CRITICAL, ERROR.
io.streamline
Enum to simplify future change to convention.
Enum to simplify future change to convention.
Class for stateful representation of collections of streamlines. Object designed to be identical no matter the file format (trk, tck, vtk, fib, dpy).
Container for streamlines and their data information.
Write a standard trk/tck header from spatial attributes.
Deep copy operation on arbitrary Python objects.
Returns the StreamlinesFile object guessed from the file-like object.
Will compare the spatial attribute of 2 references.
Load the stateful tractogram of the .dpy format.
Load the stateful tractogram of the .fib format.
Generate a loading function that performs a file extension check to restrict the user to a single file format.
Load the stateful tractogram of the .tck format.
Load the stateful tractogram from any format (trk, tck, vtk, fib, dpy).
Load the stateful tractogram of the .trk format.
Load the stateful tractogram of the .vtk format.
Load streamlines from vtk polydata.
Save the stateful tractogram of the .dpy format.
Save the stateful tractogram of the .fib format.
Generate a saving function that performs a file extension check to restrict the user to a single file format.
Save the stateful tractogram of the .tck format.
Save the stateful tractogram in any format (trk, tck, vtk, fib, dpy).
Save the stateful tractogram of the .trk format.
Save the stateful tractogram of the .vtk format.
Save streamlines as vtk polydata to a supported format file.
io.utils
Utility functions for file formats
Class for single file NIfTI1 format image.
Write a standard nifti header from spatial attributes.
Write a standard trk/tck header from spatial attributes.
Create a nifti-compliant directional-encoded color FA image.
Convert a nifti-compliant directional-encoded color FA image into a nifti image with RGB encoded in floating point resolution.
Returns the StreamlinesFile object guessed from the file-like object.
Will compare the spatial attribute of 2 references.
Will compare the spatial attribute of 2 references.
Validate basic data type and value of spatial attribute.
Reshapes the input to have 5 dimensions, adding the extra dimensions just before the last dimension.
Returns a Nifti1Image with a symmetric matrix intent.
Return package-like thing and module setup for package name.
Helper function that handles inputs that can be paths, nifti img or arrays.
Saves the given input dataframe to a .h5 file.
io.vtk
Load a vtk polydata from a supported format file.
Load streamlines from vtk polydata.
Return package-like thing and module setup for package name.
Save a vtk polydata to a supported format file.
Save streamlines as vtk polydata to a supported format file.
Apply affine transformation to streamlines.
Dpy
dipy.io.Dpy(fname, mode='r', compression=0)
Bases: object
Methods
read one track each time
read the entire tractography
read tracks with specific indices
write one track each time
write many tracks together
close
version
__init__(fname, mode='r', compression=0)
Advanced storage system for tractography based on HDF5.
'w' write, 'r+' read and write only if file already exists.
Examples
>>> import os
>>> import numpy as np
>>> from tempfile import mkstemp  # temp file
>>> from dipy.io.dpy import Dpy
>>> def dpy_example():
...     fd, fname = mkstemp()
...     fname += '.dpy'  # add correct extension
...     dpw = Dpy(fname, 'w')
...     A = np.ones((5, 3))
...     B = 2 * A.copy()
...     C = 3 * A.copy()
...     dpw.write_track(A)
...     dpw.write_track(B)
...     dpw.write_track(C)
...     dpw.close()
...     dpr = Dpy(fname, 'r')
...     dpr.read_track()
...     dpr.read_track()
...     dpr.read_tracksi([0, 1, 2, 0, 0, 2])
...     dpr.close()
...     os.remove(fname)  # delete file from disk
>>> dpy_example()
dipy.io.read_bvals_bvecs(fbvals, fbvecs)
Read b-values and b-vectors from disk.
Full path to file with b-values. None to not read bvals.
Full path of file with b-vectors. None to not read bvecs.
Notes
Files can be either ‘.bvals’/’.bvecs’ or ‘.txt’ or ‘.npy’ (containing arrays stored with the appropriate values).
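A minimal usage sketch (not part of the original docstring; 'dwi.bval' and 'dwi.bvec' are hypothetical paths):
>>> from dipy.io.gradients import read_bvals_bvecs
>>> from dipy.core.gradients import gradient_table
>>> bvals, bvecs = read_bvals_bvecs('dwi.bval', 'dwi.bvec')
>>> gtab = gradient_table(bvals, bvecs)  # the arrays are usually wrapped in a GradientTable for reconstruction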
dipy.io.read_bvec_file(filename, atol=0.001)
Read gradient table information from a pair of files with extensions .bvec and .bval. The bval file should have one row of values representing the bvalues of each volume in the dwi data set. The bvec file should have three rows, where the rows are the x, y, and z components of the normalized gradient direction for each of the volumes.
The path to either the bvec or the bval file.
The tolerance used to check that the gradient directions are normalized. Default is 0.001.
dipy.io.reorient_vectors(input, current_ornt, new_ornt, axis=0)
Changes the orientation of gradients or other vectors.
Moves vectors, stored along axis, from current_ornt to new_ornt. For example the vector [x, y, z] in "RAS" will be [-x, -y, z] in "LPS".
R: Right, A: Anterior, S: Superior, L: Left, P: Posterior, I: Inferior
Examples
>>> gtab = np.array([[1, 1, 1], [1, 2, 3]])
>>> reorient_vectors(gtab, 'ras', 'asr', axis=1)
array([[1, 1, 1],
[2, 3, 1]])
>>> reorient_vectors(gtab, 'ras', 'lps', axis=1)
array([[-1, -1, 1],
[-1, -2, 3]])
>>> bvec = gtab.T
>>> reorient_vectors(bvec, 'ras', 'lps', axis=0)
array([[-1, -1],
[-1, -2],
[ 1, 3]])
>>> reorient_vectors(bvec, 'ras', 'lsp')
array([[-1, -1],
[ 1, 3],
[-1, -2]])
dipy.io.save_pickle(fname, dix)
Save dix to fname as pickle.
fname: filename to save object, e.g. a dictionary
dix: dictionary or other object
See also
Examples
>>> import os
>>> from tempfile import mkstemp
>>> from dipy.io.pickles import save_pickle, load_pickle
>>> fd, fname = mkstemp()  # make temporary file (opened, attached to fh)
>>> d = {0: {'d': 1}}
>>> save_pickle(fname, d)
>>> d2 = load_pickle(fname)
We remove the temporary file we created for neatness
>>> os.close(fd) # the file is still open, we need to close the fh
>>> os.remove(fname)
dipy.io.bvectxt.read_bvec_file(filename, atol=0.001)
Read gradient table information from a pair of files with extensions .bvec and .bval. The bval file should have one row of values representing the bvalues of each volume in the dwi data set. The bvec file should have three rows, where the rows are the x, y, and z components of the normalized gradient direction for each of the volumes.
The path to either the bvec or the bval file.
The tolerance used to check that the gradient directions are normalized. Default is 0.001.
dipy.io.bvectxt.reorient_vectors(input, current_ornt, new_ornt, axis=0)
Changes the orientation of gradients or other vectors.
Moves vectors, stored along axis, from current_ornt to new_ornt. For example the vector [x, y, z] in "RAS" will be [-x, -y, z] in "LPS".
R: Right, A: Anterior, S: Superior, L: Left, P: Posterior, I: Inferior
Examples
>>> gtab = np.array([[1, 1, 1], [1, 2, 3]])
>>> reorient_vectors(gtab, 'ras', 'asr', axis=1)
array([[1, 1, 1],
[2, 3, 1]])
>>> reorient_vectors(gtab, 'ras', 'lps', axis=1)
array([[-1, -1, 1],
[-1, -2, 3]])
>>> bvec = gtab.T
>>> reorient_vectors(bvec, 'ras', 'lps', axis=0)
array([[-1, -1],
[-1, -2],
[ 1, 3]])
>>> reorient_vectors(bvec, 'ras', 'lsp')
array([[-1, -1],
[ 1, 3],
[-1, -2]])
Dpy
dipy.io.dpy.Dpy(fname, mode='r', compression=0)
Bases: object
Methods
read one track each time
read the entire tractography
read tracks with specific indices
write one track each time
write many tracks together
close
version
__init__(fname, mode='r', compression=0)
Advanced storage system for tractography based on HDF5.
'w' write, 'r+' read and write only if file already exists.
Examples
>>> import os
>>> import numpy as np
>>> from tempfile import mkstemp  # temp file
>>> from dipy.io.dpy import Dpy
>>> def dpy_example():
...     fd, fname = mkstemp()
...     fname += '.dpy'  # add correct extension
...     dpw = Dpy(fname, 'w')
...     A = np.ones((5, 3))
...     B = 2 * A.copy()
...     C = 3 * A.copy()
...     dpw.write_track(A)
...     dpw.write_track(B)
...     dpw.write_track(C)
...     dpw.close()
...     dpr = Dpy(fname, 'r')
...     dpr.read_track()
...     dpr.read_track()
...     dpr.read_tracksi([0, 1, 2, 0, 0, 2])
...     dpr.close()
...     os.remove(fname)  # delete file from disk
>>> dpy_example()
InTemporaryDirectory
dipy.io.gradients.InTemporaryDirectory(suffix='', prefix='tmp', dir=None)
Bases: nibabel.tmpdirs.TemporaryDirectory
Create, return, and change directory to a temporary directory
Examples
>>> import os
>>> my_cwd = os.getcwd()
>>> with InTemporaryDirectory() as tmpdir:
... _ = open('test.txt', 'wt').write('some text')
... assert os.path.isfile('test.txt')
... assert os.path.isfile(os.path.join(tmpdir, 'test.txt'))
>>> os.path.exists(tmpdir)
False
>>> os.getcwd() == my_cwd
True
Methods
cleanup
dipy.io.gradients.read_bvals_bvecs(fbvals, fbvecs)
Read b-values and b-vectors from disk.
Full path to file with b-values. None to not read bvals.
Full path of file with b-vectors. None to not read bvecs.
Notes
Files can be either ‘.bvals’/’.bvecs’ or ‘.txt’ or ‘.npy’ (containing arrays stored with the appropriate values).
dipy.io.image.load_nifti(fname, return_img=False, return_voxsize=False, return_coords=False, as_ndarray=True)
Load data and other information from a nifti file.
Full path to a nifti file.
Whether to return the nibabel nifti img object. Default: False
Whether to return the nifti header zooms. Default: False
Whether to return the nifti header aff2axcodes. Default: False
convert nibabel ArrayProxy to a numpy.ndarray. If you want to save memory and delay this casting, just turn this option to False (default: True)
See also
dipy.io.image.load_nifti_data(fname, as_ndarray=True)
Load only the data array from a nifti file.
Full path to the file.
convert nibabel ArrayProxy to a numpy.ndarray. If you want to save memory and delay this casting, just turn this option to False (default: True)
See also
dipy.io.image.save_nifti(fname, data, affine, hdr=None)
Save a data array into a nifti file.
The full path to the file to be saved.
The array with the data to save.
The affine transform associated with the file.
May contain additional information to store in the file header.
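A minimal load/save round-trip sketch (not part of the original docstrings; the paths are hypothetical):
>>> from dipy.io.image import load_nifti, save_nifti
>>> data, affine = load_nifti('dwi.nii.gz')                       # default return: (data, affine)
>>> data, affine, img = load_nifti('dwi.nii.gz', return_img=True)
>>> save_nifti('dwi_copy.nii.gz', data, affine)                   # header rebuilt from data and affine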
PeaksAndMetrics
dipy.io.peaks.PeaksAndMetrics
Bases: dipy.reconst.eudx_direction_getter.EuDXDirectionGetter
Methods
The best starting directions for fiber tracking from point.
get_direction
Sphere
dipy.io.peaks.Sphere(x=None, y=None, z=None, theta=None, phi=None, xyz=None, faces=None, edges=None)
Bases: object
Points on the unit sphere.
The sphere can be constructed using one of three conventions:
Sphere(x, y, z)
Sphere(xyz=xyz)
Sphere(theta=theta, phi=phi)
Vertices as x-y-z coordinates.
Vertices as spherical coordinates. Theta and phi are the inclination and azimuth angles respectively.
Vertices as x-y-z coordinates.
Indices into vertices that form triangular faces. If unspecified, the faces are computed using a Delaunay triangulation.
Edges between vertices. If unspecified, the edges are derived from the faces.
Methods
Find the index of the vertex in the Sphere closest to the input vector.
Subdivides each face of the sphere into four new faces.
edges
faces
vertices
__init__(x=None, y=None, z=None, theta=None, phi=None, xyz=None, faces=None, edges=None)
Initialize self. See help(type(self)) for accurate signature.
find_closest(xyz)
Find the index of the vertex in the Sphere closest to the input vector.
xyz: A unit vector.
Returns the index into the Sphere.vertices array that gives the closest vertex (in angle).
subdivide(n=1)
Subdivides each face of the sphere into four new faces.
New vertices are created at a, b, and c. Then each face [x, y, z] is divided into faces [x, a, c], [y, a, b], [z, b, c], and [a, b, c].

        y
        /\
       /  \
     a/____\b
     /\    /\
    /  \  /  \
   /____\/____\
  x      c     z

The number of subdivisions to perform.
The subdivided sphere.
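A short usage sketch (not from the original docs; unit_icosahedron is one of the predefined spheres in dipy.core.sphere):
>>> import numpy as np
>>> from dipy.core.sphere import unit_icosahedron
>>> sphere = unit_icosahedron.subdivide(n=2)              # each pass splits every face into four
>>> idx = sphere.find_closest(np.array([0.0, 0.0, 1.0]))  # index of the vertex nearest the +z axis
>>> vertex = sphere.vertices[idx]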
dipy.io.peaks.reshape_peaks_for_visualization(peaks)
Reshape peaks for visualization.
Reshape and convert to float32 a set of peaks for visualisation with mrtrix or the fibernavigator.
The peaks to be reshaped and converted to float32.
dipy.io.peaks.save_nifti(fname, data, affine, hdr=None)
Save a data array into a nifti file.
The full path to the file to be saved.
The array with the data to save.
The affine transform associated with the file.
May contain additional information to store in the file header.
dipy.io.peaks.save_peaks(fname, pam, affine=None, verbose=False)
Save all important attributes of object PeaksAndMetrics in a PAM5 file (HDF5).
Filename of PAM5 file.
Object holding peak_dirs, shm_coeffs and other attributes.
The 4x4 matrix transforming the data from native to world coordinates. PeaksAndMetrics should have that attribute, but if not it can be provided here. Default None.
Print summary information about the saved file.
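A minimal save/load sketch (not from the original docstring; pam is assumed to come from a peaks estimation step such as peaks_from_model, and 'peaks.pam5' is a hypothetical path):
>>> from dipy.io.peaks import save_peaks, load_peaks
>>> _ = save_peaks('peaks.pam5', pam, affine=affine)
>>> pam2 = load_peaks('peaks.pam5', verbose=True)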
dipy.io.pickles.save_pickle(fname, dix)
Save dix to fname as pickle.
filename to save object e.g. a dictionary
dictionary or other object
See also
Examples
>>> import os
>>> from tempfile import mkstemp
>>> from dipy.io.pickles import save_pickle, load_pickle
>>> fd, fname = mkstemp()  # make temporary file (opened, attached to fh)
>>> d = {0: {'d': 1}}
>>> save_pickle(fname, d)
>>> d2 = load_pickle(fname)
We remove the temporary file we created for neatness
>>> os.close(fd) # the file is still open, we need to close the fh
>>> os.remove(fname)
Origin
dipy.io.stateful_tractogram.Origin
Bases: enum.Enum
Enum to simplify future change to convention
PerArrayDict
dipy.io.stateful_tractogram.PerArrayDict(n_rows=0, *args, **kwargs)
Bases: nibabel.streamlines.tractogram.SliceableDataDict
Dictionary for which key access can do slicing on the values.
This container behaves like a standard dictionary but extends key access to allow keys for key access to be indices slicing into the contained ndarray values. The elements must also be ndarrays.
In addition, it makes sure the amount of data contained in those ndarrays matches the number of streamlines given at the instantiation of this instance.
Number of rows per value in each key, value pair or None for not specified.
Positional and keyword arguments, passed straight through the dict constructor.
Methods
Appends the elements of another PerArrayDict.
If key is not found, d is returned if given, otherwise KeyError is raised.
Remove and return a (key, value) pair as a 2-tuple; raise KeyError if D is empty.
If E present and has a .keys() method, does: for k in E: D[k] = E[k]. If E present and lacks a .keys() method, does: for (k, v) in E: D[k] = v. In either case, this is followed by: for k, v in F.items(): D[k] = v.
extend(other)
Appends the elements of another PerArrayDict.
That is, for each entry in this dictionary, we append the elements coming from the other dictionary at the corresponding entry.
other: PerArrayDict object. Its data will be appended to the data of this dictionary.
Notes
The keys in both dictionaries must be the same.
PerArraySequenceDict
dipy.io.stateful_tractogram.PerArraySequenceDict(n_rows=0, *args, **kwargs)
Bases: nibabel.streamlines.tractogram.PerArrayDict
Dictionary for which key access can do slicing on the values.
This container behaves like a standard dictionary but extends key access to allow keys for key access to be indices slicing into the contained ndarray values. The elements must also be ArraySequence.
In addition, it makes sure the amount of data contained in those array sequences matches the number of elements given at the instantiation of the instance.
Methods
Appends the elements of another PerArrayDict.
If key is not found, d is returned if given, otherwise KeyError is raised.
Remove and return a (key, value) pair as a 2-tuple; raise KeyError if D is empty.
If E present and has a .keys() method, does: for k in E: D[k] = E[k]. If E present and lacks a .keys() method, does: for (k, v) in E: D[k] = v. In either case, this is followed by: for k, v in F.items(): D[k] = v.
Space
dipy.io.stateful_tractogram.Space
Bases: enum.Enum
Enum to simplify future change to convention
StatefulTractogram
dipy.io.stateful_tractogram.StatefulTractogram(streamlines, reference, space, origin=<Origin.NIFTI: 'center'>, data_per_point=None, data_per_streamline=None)
Bases: object
Class for stateful representation of collections of streamlines. Object designed to be identical no matter the file format (trk, tck, vtk, fib, dpy). Facilitates transformation between spaces and data manipulation for each streamline / point.
affine
Getter for the reference affine
data_per_point
Getter for data_per_point
data_per_streamline
Getter for data_per_streamline
dimensions
Getter for the reference dimensions
origin
Getter for origin standard
space
Getter for the current space
space_attributes
Getter for spatial attribute
streamlines
Partially safe getter for streamlines
voxel_order
Getter for the reference voxel order
voxel_sizes
Getter for the reference voxel sizes
Methods
Compatibility verification of two StatefulTractogram to ensure space, origin, data_per_point and data_per_streamline consistency.
Compute the bounding box of the streamlines in their current state.
Create an instance of StatefulTractogram from another instance of StatefulTractogram.
Return a list of the data_per_point attribute names.
Return a list of the data_per_streamline attribute names.
Safe getter for streamlines (for slicing).
Verify that the bounding box is valid in voxel space.
Remove streamlines with invalid coordinates from the object.
Safe function to shift streamlines so the center of the voxel is the origin.
Safe function to shift streamlines so the corner of the voxel is the origin.
Safe function to change streamlines to a particular origin standard. False means NIFTI (center) and True means TrackVis (corner).
Safe function to transform streamlines and update state.
Safe function to transform streamlines to a particular space using an enum and update state.
Safe function to transform streamlines and update state.
Safe function to transform streamlines and update state.
__init__(streamlines, reference, space, origin=<Origin.NIFTI: 'center'>, data_per_point=None, data_per_streamline=None)
Create a strict, state-aware, robust tractogram.
streamlines: Streamlines of the tractogram.
reference: Nifti1Header, trk.header (dict) or another StatefulTractogram. Reference that provides the spatial attributes. Typically a nifti-related object from the native diffusion used for streamlines generation.
space: Current space in which the streamlines are (vox, voxmm or rasmm). After tracking the space is VOX; after loading with nibabel the space is RASMM.
origin: Current origin in which the streamlines are (center or corner). After loading with nibabel the origin is CENTER.
data_per_point: Dictionary in which each key has X items and each item has Y_i items, X being the number of streamlines and Y_i the number of points on streamline #i.
data_per_streamline: Dictionary in which each key has X items, X being the number of streamlines.
Notes
Very important to respect the convention, verify that streamlines match the reference and are effectively in the right space.
Any change to the number of streamlines, data_per_point or data_per_streamline requires particular verification.
In a case of manipulation not allowed by this object, use Nibabel directly and be careful.
are_compatible(sft_1, sft_2)
Compatibility verification of two StatefulTractogram to ensure space, origin, data_per_point and data_per_streamline consistency.
compute_bounding_box()
Compute the bounding box of the streamlines in their current state.
Returns the 8 corners of the XYZ-aligned box, all zeros if there are no streamlines.
from_sft(streamlines, sft, data_per_point=None, data_per_streamline=None)
Create an instance of StatefulTractogram from another instance of StatefulTractogram.
streamlines: Streamlines of the tractogram.
sft: The other StatefulTractogram to copy the space_attributes AND state from.
data_per_point: Dictionary in which each key has X items and each item has Y_i items, X being the number of streamlines and Y_i the number of points on streamline #i.
data_per_streamline: Dictionary in which each key has X items, X being the number of streamlines.
is_bbox_in_vox_valid()
Verify that the bounding box is valid in voxel space. Negative coordinates or coordinates above the volume dimensions are considered invalid in voxel space.
Returns whether the streamlines are within the volume of the associated reference.
remove_invalid_streamlines(epsilon=0.001)
Remove streamlines with invalid coordinates from the object. Will also remove the corresponding data_per_point and data_per_streamline. Invalid coordinates are any X, Y, Z values above the reference dimensions or below zero.
epsilon: Epsilon value for the bounding box verification. Default is 1e-3.
Returns a tuple of two lists: indices_to_remove, indices_to_keep.
to_origin(target_origin)
Safe function to change streamlines to a particular origin standard. False means NIFTI (center) and True means TrackVis (corner).
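A minimal sketch of typical StatefulTractogram use (not from the original docs; the reference path and coordinates are hypothetical):
>>> import numpy as np
>>> from dipy.io.stateful_tractogram import StatefulTractogram, Space
>>> streamlines = [np.array([[10.0, 10.0, 10.0], [12.0, 11.0, 10.0]]),
...                np.array([[20.0, 20.0, 20.0], [22.0, 21.0, 20.0]])]
>>> sft = StatefulTractogram(streamlines, 'dwi.nii.gz', Space.RASMM)
>>> sft.to_vox()      # transform in place to voxel space
>>> sft.to_corner()   # shift the origin to the voxel corner (TrackVis convention)
>>> valid = sft.is_bbox_in_vox_valid()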
Tractogram
dipy.io.stateful_tractogram.Tractogram(streamlines=None, data_per_streamline=None, data_per_point=None, affine_to_rasmm=None)
Bases: object
Container for streamlines and their data information.
Streamlines of a tractogram can be in any coordinate system of your choice as long as you provide the correct affine_to_rasmm matrix at construction time. When applied to streamlines coordinates, that transformation matrix should bring the streamlines back to world space (RAS+ and mm space) [1].
Moreover, when streamlines are mapped back to voxel space [2], a streamline point located at an integer coordinate (i, j, k) is considered to be at the center of the corresponding voxel. This is in contrast with other conventions where it might have referred to a corner.
References
[1] http://nipy.org/nibabel/coordinate_systems.html#naming-reference-spaces
[2] http://nipy.org/nibabel/coordinate_systems.html#voxel-coordinates-are-in-voxel-space
streamlines: ArraySequence object. Sequence of \(T\) streamlines. Each streamline is an ndarray of shape (\(N_t\), 3) where \(N_t\) is the number of points of streamline \(t\).
data_per_streamline: PerArrayDict object. Dictionary where the items are (str, 2D array). Each key represents a piece of information \(i\) to be kept alongside every streamline, and its associated value is a 2D array of shape (\(T\), \(P_i\)) where \(T\) is the number of streamlines and \(P_i\) is the number of values to store for that particular piece of information \(i\).
data_per_point: PerArraySequenceDict object. Dictionary where the items are (str, ArraySequence). Each key represents a piece of information \(i\) to be kept alongside every point of every streamline, and its associated value is an iterable of ndarrays of shape (\(N_t\), \(M_i\)) where \(N_t\) is the number of points for a particular streamline \(t\) and \(M_i\) is the number of values to store for that particular piece of information \(i\).
Methods
Applies an affine transformation on the points of each streamline.
Returns a copy of this Tractogram object.
Appends the data of another Tractogram.
Brings the streamlines to world space (i.e. RAS+ and mm).
__init__(streamlines=None, data_per_streamline=None, data_per_point=None, affine_to_rasmm=None)
streamlines: ArraySequence, optional. Sequence of \(T\) streamlines. Each streamline is an ndarray of shape (\(N_t\), 3) where \(N_t\) is the number of points of streamline \(t\).
data_per_streamline: Dictionary where the items are (str, iterable). Each key represents an information \(i\) to be kept alongside every streamline, and its associated value is an iterable of ndarrays of shape (\(P_i\),) where \(P_i\) is the number of scalar values to store for that particular information \(i\).
data_per_point: Dictionary where the items are (str, iterable). Each key represents an information \(i\) to be kept alongside every point of every streamline, and its associated value is an iterable of ndarrays of shape (\(N_t\), \(M_i\)) where \(N_t\) is the number of points for a particular streamline \(t\) and \(M_i\) is the number of scalar values to store for that particular information \(i\).
affine_to_rasmm: Transformation matrix that brings the streamlines contained in this tractogram to RAS+ and mm space where coordinate (0,0,0) refers to the center of the voxel. By default, the streamlines are in an unknown space, i.e. affine_to_rasmm is None.
apply_affine(affine, lazy=False)
Applies an affine transformation on the points of each streamline.
If lazy is not specified, this is performed in-place.
affine: Transformation that will be applied to every streamline.
lazy: If True, streamlines are not transformed in-place and a LazyTractogram object is returned. Otherwise, streamlines are modified in-place.
Returns a Tractogram or LazyTractogram object where the streamlines have been transformed according to the given affine transformation. If the lazy option is true, it returns a LazyTractogram object, otherwise it returns a reference to this Tractogram object with updated streamlines.
copy()
Returns a copy of this Tractogram object.
extend(other)
Appends the data of another Tractogram.
Data that will be appended includes the streamlines and the content of both dictionaries data_per_streamline and data_per_point.
other: Tractogram object. Its data will be appended to the data of this tractogram.
Notes
The entries in both dictionaries self.data_per_streamline and self.data_per_point must match respectively those contained in the other tractogram.
to_world(lazy=False)
Brings the streamlines to world space (i.e. RAS+ and mm).
If lazy is not specified, this is performed in-place.
lazy: If True, streamlines are not transformed in-place and a LazyTractogram object is returned. Otherwise, streamlines are modified in-place.
Returns a Tractogram or LazyTractogram object where the streamlines have been sent to world space. If the lazy option is true, it returns a LazyTractogram object, otherwise it returns a reference to this Tractogram object with updated streamlines.
product
dipy.io.stateful_tractogram.product
Bases: object
product(*iterables, repeat=1) --> product object
Cartesian product of input iterables. Equivalent to nested for-loops.
For example, product(A, B) returns the same as: ((x,y) for x in A for y in B). The leftmost iterators are in the outermost for-loop, so the output tuples cycle in a manner similar to an odometer (with the rightmost element changing on every iteration).
To compute the product of an iterable with itself, specify the number of repetitions with the optional repeat keyword argument. For example, product(A, repeat=4) means the same as product(A, A, A, A).
product(‘ab’, range(3)) –> (‘a’,0) (‘a’,1) (‘a’,2) (‘b’,0) (‘b’,1) (‘b’,2) product((0,1), (0,1), (0,1)) –> (0,0,0) (0,0,1) (0,1,0) (0,1,1) (1,0,0) …
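For reference, product here is simply itertools.product re-exported into this namespace; a short illustration:
>>> from itertools import product
>>> list(product('ab', range(2)))
[('a', 0), ('a', 1), ('b', 0), ('b', 1)]
>>> list(product((0, 1), repeat=3))[:3]
[(0, 0, 0), (0, 0, 1), (0, 1, 0)]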
dipy.io.stateful_tractogram.apply_affine(aff, pts)
Apply affine matrix aff to points pts.
Returns the result of applying aff to the right of pts. The coordinate dimension of pts should be the last.
For the 3D case, aff will be shape (4, 4) and pts will have final axis length 3 - maybe it will just be N by 3. The return value is the transformed points, in this case:
res = np.dot(aff[:3,:3], pts.T) + aff[:3,3:4]
transformed_pts = res.T
This routine is more general than 3D, in that aff can have any shape (N, N), and pts can have any shape, as long as the last dimension is for the coordinates, and is therefore length N-1.
aff: Homogeneous affine; for 3D points it will be 4 by 4. Contrary to first appearance, the affine will be applied on the left of pts.
pts: Points, where the last dimension contains the coordinates of each point. For 3D, the last dimension will be length 3.
Returns the transformed points.
Examples
>>> aff = np.array([[0,2,0,10],[3,0,0,11],[0,0,4,12],[0,0,0,1]])
>>> pts = np.array([[1,2,3],[2,3,4],[4,5,6],[6,7,8]])
>>> apply_affine(aff, pts)
array([[14, 14, 24],
[16, 17, 28],
[20, 23, 36],
[24, 29, 44]]...)
Just to show that in the simple 3D case, it is equivalent to:
>>> (np.dot(aff[:3,:3], pts.T) + aff[:3,3:4]).T
array([[14, 14, 24],
[16, 17, 28],
[20, 23, 36],
[24, 29, 44]]...)
But pts can be a more complicated shape:
>>> pts = pts.reshape((2,2,3))
>>> apply_affine(aff, pts)
array([[[14, 14, 24],
[16, 17, 28]],
[[20, 23, 36],
[24, 29, 44]]]...)
dipy.io.stateful_tractogram.bisect()
bisect_right(a, x[, lo[, hi]]) -> index
Return the index where to insert item x in list a, assuming a is sorted.
The return value i is such that all e in a[:i] have e <= x, and all e in a[i:] have e > x. So if x already appears in the list, i points just beyond the rightmost x already there
Optional args lo (default 0) and hi (default len(a)) bound the slice of a to be searched.
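bisect here is the standard library bisect_right; a short illustration:
>>> from bisect import bisect_right
>>> a = [1, 2, 4, 4, 8]
>>> bisect_right(a, 4)          # insertion point just past the existing 4s
4
>>> bisect_right(a, 4, 0, 3)    # search restricted to a[0:3]
3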
dipy.io.stateful_tractogram.get_reference_info(reference)
Extract the spatial attributes (affine, dimensions, voxel sizes and voxel order) from a reference.
reference: Nifti1Header or trk.header (dict). Reference that provides the spatial attributes.
Returns:
affine: ndarray (4, 4), np.float32, transformation of VOX to RASMM
dimensions: ndarray (3,), int16, volume shape for each axis
voxel_sizes: ndarray (3,), float32, size of voxel for each axis
voxel_order: string, typically 'RAS' or 'LPS'
dipy.io.stateful_tractogram.is_header_compatible(reference_1, reference_2)
Will compare the spatial attributes of 2 references.
Nifti1Header or trk.header (dict) Reference that provides the spatial attribute.
Nifti1Header or trk.header (dict) Reference that provides the spatial attribute.
Does all the spatial attribute match
dipy.io.stateful_tractogram.is_reference_info_valid(affine, dimensions, voxel_sizes, voxel_order)
Validate basic data type and value of spatial attribute.
Does not ensure that voxel_sizes and voxel_order are self-coherent with the affine. Only verifies the following:
affine is of the right type (float) and dimension (4, 4)
affine contains values in the rotation part
dimensions is of the right type (int) and length (3)
voxel_sizes is of the right type (float) and length (3)
voxel_order is of the right type (str) and length (3)
The listed parameters are what is expected; provide something else and this function should fail (covers common mistakes).
affine: Transformation of VOX to RASMM
dimensions: Volume shape for each axis
voxel_sizes: Size of voxel for each axis
voxel_order: Typically 'RAS' or 'LPS'
Returns whether the input represents a valid 'state' of spatial attributes.
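A minimal sketch combining get_reference_info and is_reference_info_valid (not from the original docs; 'dwi.nii.gz' is a hypothetical path):
>>> from dipy.io.utils import get_reference_info, is_reference_info_valid
>>> affine, dimensions, voxel_sizes, voxel_order = get_reference_info('dwi.nii.gz')
>>> valid = is_reference_info_valid(affine, dimensions, voxel_sizes, voxel_order)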
Dpy
dipy.io.streamline.Dpy(fname, mode='r', compression=0)
Bases: object
Methods
read one track each time
read the entire tractography
read tracks with specific indices
write one track each time
write many tracks together
close
version
__init__(fname, mode='r', compression=0)
Advanced storage system for tractography based on HDF5.
'w' write, 'r+' read and write only if file already exists.
Examples
>>> import os
>>> import numpy as np
>>> from tempfile import mkstemp  # temp file
>>> from dipy.io.dpy import Dpy
>>> def dpy_example():
...     fd, fname = mkstemp()
...     fname += '.dpy'  # add correct extension
...     dpw = Dpy(fname, 'w')
...     A = np.ones((5, 3))
...     B = 2 * A.copy()
...     C = 3 * A.copy()
...     dpw.write_track(A)
...     dpw.write_track(B)
...     dpw.write_track(C)
...     dpw.close()
...     dpr = Dpy(fname, 'r')
...     dpr.read_track()
...     dpr.read_track()
...     dpr.read_tracksi([0, 1, 2, 0, 0, 2])
...     dpr.close()
...     os.remove(fname)  # delete file from disk
>>> dpy_example()
Origin
dipy.io.streamline.Origin
Bases: enum.Enum
Enum to simplify future change to convention
StatefulTractogram
dipy.io.streamline.StatefulTractogram(streamlines, reference, space, origin=<Origin.NIFTI: 'center'>, data_per_point=None, data_per_streamline=None)
Bases: object
Class for stateful representation of collections of streamlines. Object designed to be identical no matter the file format (trk, tck, vtk, fib, dpy). Facilitates transformation between spaces and data manipulation for each streamline / point.
affine
Getter for the reference affine
data_per_point
Getter for data_per_point
data_per_streamline
Getter for data_per_streamline
dimensions
Getter for the reference dimensions
origin
Getter for origin standard
space
Getter for the current space
space_attributes
Getter for spatial attribute
streamlines
Partially safe getter for streamlines
voxel_order
Getter for the reference voxel order
voxel_sizes
Getter for the reference voxel sizes
Methods
Compatibility verification of two StatefulTractogram to ensure space, origin, data_per_point and data_per_streamline consistency.
Compute the bounding box of the streamlines in their current state.
Create an instance of StatefulTractogram from another instance of StatefulTractogram.
Return a list of the data_per_point attribute names.
Return a list of the data_per_streamline attribute names.
Safe getter for streamlines (for slicing).
Verify that the bounding box is valid in voxel space.
Remove streamlines with invalid coordinates from the object.
Safe function to shift streamlines so the center of the voxel is the origin.
Safe function to shift streamlines so the corner of the voxel is the origin.
Safe function to change streamlines to a particular origin standard. False means NIFTI (center) and True means TrackVis (corner).
Safe function to transform streamlines and update state.
Safe function to transform streamlines to a particular space using an enum and update state.
Safe function to transform streamlines and update state.
Safe function to transform streamlines and update state.
__init__(streamlines, reference, space, origin=<Origin.NIFTI: 'center'>, data_per_point=None, data_per_streamline=None)
Create a strict, state-aware, robust tractogram.
streamlines: Streamlines of the tractogram.
reference: Nifti1Header, trk.header (dict) or another StatefulTractogram. Reference that provides the spatial attributes. Typically a nifti-related object from the native diffusion used for streamlines generation.
space: Current space in which the streamlines are (vox, voxmm or rasmm). After tracking the space is VOX; after loading with nibabel the space is RASMM.
origin: Current origin in which the streamlines are (center or corner). After loading with nibabel the origin is CENTER.
data_per_point: Dictionary in which each key has X items and each item has Y_i items, X being the number of streamlines and Y_i the number of points on streamline #i.
data_per_streamline: Dictionary in which each key has X items, X being the number of streamlines.
Notes
Very important to respect the convention, verify that streamlines match the reference and are effectively in the right space.
Any change to the number of streamlines, data_per_point or data_per_streamline requires particular verification.
In a case of manipulation not allowed by this object, use Nibabel directly and be careful.
are_compatible(sft_1, sft_2)
Compatibility verification of two StatefulTractogram to ensure space, origin, data_per_point and data_per_streamline consistency.
compute_bounding_box()
Compute the bounding box of the streamlines in their current state.
Returns the 8 corners of the XYZ-aligned box, all zeros if there are no streamlines.
from_sft(streamlines, sft, data_per_point=None, data_per_streamline=None)
Create an instance of StatefulTractogram from another instance of StatefulTractogram.
streamlines: Streamlines of the tractogram.
sft: The other StatefulTractogram to copy the space_attributes AND state from.
data_per_point: Dictionary in which each key has X items and each item has Y_i items, X being the number of streamlines and Y_i the number of points on streamline #i.
data_per_streamline: Dictionary in which each key has X items, X being the number of streamlines.
is_bbox_in_vox_valid()
Verify that the bounding box is valid in voxel space. Negative coordinates or coordinates above the volume dimensions are considered invalid in voxel space.
Returns whether the streamlines are within the volume of the associated reference.
remove_invalid_streamlines(epsilon=0.001)
Remove streamlines with invalid coordinates from the object. Will also remove the corresponding data_per_point and data_per_streamline. Invalid coordinates are any X, Y, Z values above the reference dimensions or below zero.
epsilon: Epsilon value for the bounding box verification. Default is 1e-3.
Returns a tuple of two lists: indices_to_remove, indices_to_keep.
to_origin(target_origin)
Safe function to change streamlines to a particular origin standard. False means NIFTI (center) and True means TrackVis (corner).
Tractogram
dipy.io.streamline.Tractogram(streamlines=None, data_per_streamline=None, data_per_point=None, affine_to_rasmm=None)
Bases: object
Container for streamlines and their data information.
Streamlines of a tractogram can be in any coordinate system of your choice as long as you provide the correct affine_to_rasmm matrix at construction time. When applied to streamlines coordinates, that transformation matrix should bring the streamlines back to world space (RAS+ and mm space) [1].
Moreover, when streamlines are mapped back to voxel space [2], a streamline point located at an integer coordinate (i, j, k) is considered to be at the center of the corresponding voxel. This is in contrast with other conventions where it might have referred to a corner.
References
[1] http://nipy.org/nibabel/coordinate_systems.html#naming-reference-spaces
[2] http://nipy.org/nibabel/coordinate_systems.html#voxel-coordinates-are-in-voxel-space
streamlines: ArraySequence object. Sequence of \(T\) streamlines. Each streamline is an ndarray of shape (\(N_t\), 3) where \(N_t\) is the number of points of streamline \(t\).
data_per_streamline: PerArrayDict object. Dictionary where the items are (str, 2D array). Each key represents a piece of information \(i\) to be kept alongside every streamline, and its associated value is a 2D array of shape (\(T\), \(P_i\)) where \(T\) is the number of streamlines and \(P_i\) is the number of values to store for that particular piece of information \(i\).
data_per_point: PerArraySequenceDict object. Dictionary where the items are (str, ArraySequence). Each key represents a piece of information \(i\) to be kept alongside every point of every streamline, and its associated value is an iterable of ndarrays of shape (\(N_t\), \(M_i\)) where \(N_t\) is the number of points for a particular streamline \(t\) and \(M_i\) is the number of values to store for that particular piece of information \(i\).
Methods
Applies an affine transformation on the points of each streamline.
Returns a copy of this Tractogram object.
Appends the data of another Tractogram.
Brings the streamlines to world space (i.e. RAS+ and mm).
__init__(streamlines=None, data_per_streamline=None, data_per_point=None, affine_to_rasmm=None)
streamlines: ArraySequence, optional. Sequence of \(T\) streamlines. Each streamline is an ndarray of shape (\(N_t\), 3) where \(N_t\) is the number of points of streamline \(t\).
data_per_streamline: Dictionary where the items are (str, iterable). Each key represents an information \(i\) to be kept alongside every streamline, and its associated value is an iterable of ndarrays of shape (\(P_i\),) where \(P_i\) is the number of scalar values to store for that particular information \(i\).
data_per_point: Dictionary where the items are (str, iterable). Each key represents an information \(i\) to be kept alongside every point of every streamline, and its associated value is an iterable of ndarrays of shape (\(N_t\), \(M_i\)) where \(N_t\) is the number of points for a particular streamline \(t\) and \(M_i\) is the number of scalar values to store for that particular information \(i\).
affine_to_rasmm: Transformation matrix that brings the streamlines contained in this tractogram to RAS+ and mm space where coordinate (0,0,0) refers to the center of the voxel. By default, the streamlines are in an unknown space, i.e. affine_to_rasmm is None.
apply_affine(affine, lazy=False)
Applies an affine transformation on the points of each streamline.
If lazy is not specified, this is performed in-place.
affine: Transformation that will be applied to every streamline.
lazy: If True, streamlines are not transformed in-place and a LazyTractogram object is returned. Otherwise, streamlines are modified in-place.
Returns a Tractogram or LazyTractogram object where the streamlines have been transformed according to the given affine transformation. If the lazy option is true, it returns a LazyTractogram object, otherwise it returns a reference to this Tractogram object with updated streamlines.
copy()
Returns a copy of this Tractogram object.
extend(other)
Appends the data of another Tractogram.
Data that will be appended includes the streamlines and the content of both dictionaries data_per_streamline and data_per_point.
other: Tractogram object. Its data will be appended to the data of this tractogram.
Notes
The entries in both dictionaries self.data_per_streamline and self.data_per_point must match respectively those contained in the other tractogram.
to_world(lazy=False)
Brings the streamlines to world space (i.e. RAS+ and mm).
If lazy is not specified, this is performed in-place.
lazy: If True, streamlines are not transformed in-place and a LazyTractogram object is returned. Otherwise, streamlines are modified in-place.
Returns a Tractogram or LazyTractogram object where the streamlines have been sent to world space. If the lazy option is true, it returns a LazyTractogram object, otherwise it returns a reference to this Tractogram object with updated streamlines.
dipy.io.streamline.detect_format(fileobj)
Returns the StreamlinesFile object guessed from the file-like object.
fileobj: If string, a filename; otherwise an open file-like object pointing to a tractogram file (and ready to read from the beginning of the header).
Returns the TractogramFile class type guessed from the content of fileobj.
dipy.io.streamline.is_header_compatible(reference_1, reference_2)
Will compare the spatial attributes of 2 references.
Nifti1Header or trk.header (dict) Reference that provides the spatial attribute.
Nifti1Header or trk.header (dict) Reference that provides the spatial attribute.
Does all the spatial attribute match
dipy.io.streamline.load_dpy(filename, reference, to_space=<Space.RASMM: 'rasmm'>, to_origin=<Origin.NIFTI: 'center'>, bbox_valid_check=True, trk_header_check=True)
Load the stateful tractogram of the .dpy format.
Filename with valid extension.
trk.header (dict), or 'same' if the input is a trk file. Reference that provides the spatial attributes. Typically a nifti-related object from the native diffusion used for streamlines generation.
Space to which the streamlines will be transformed after loading.
Origin to which the streamlines will be shifted after loading: NIFTI standard, default (center of the voxel), or TRACKVIS standard (corner of the voxel).
Verification for negative voxel coordinates or values above the volume dimensions. Default is True, to enforce a valid file.
Verification that the reference has the same header (spatial attributes) as the input tractogram when a trk file is loaded.
Returns the loaded tractogram (the file must have been saved properly).
dipy.io.streamline.load_fib(filename, reference, to_space=<Space.RASMM: 'rasmm'>, to_origin=<Origin.NIFTI: 'center'>, bbox_valid_check=True, trk_header_check=True)
Load the stateful tractogram of the .fib format.
Filename with valid extension.
trk.header (dict), or 'same' if the input is a trk file. Reference that provides the spatial attributes. Typically a nifti-related object from the native diffusion used for streamlines generation.
Space to which the streamlines will be transformed after loading.
Origin to which the streamlines will be shifted after loading: NIFTI standard, default (center of the voxel), or TRACKVIS standard (corner of the voxel).
Verification for negative voxel coordinates or values above the volume dimensions. Default is True, to enforce a valid file.
Verification that the reference has the same header (spatial attributes) as the input tractogram when a trk file is loaded.
Returns the loaded tractogram (the file must have been saved properly).
dipy.io.streamline.load_generator(ttype)
Generate a loading function that performs a file extension check to restrict the user to a single file format.
ttype: Extension of the file format that requires a loader.
Returns a function (load_tractogram) that handles only one file format.
dipy.io.streamline.load_tck(filename, reference, to_space=<Space.RASMM: 'rasmm'>, to_origin=<Origin.NIFTI: 'center'>, bbox_valid_check=True, trk_header_check=True)
Load the stateful tractogram of the .tck format.
Filename with valid extension.
trk.header (dict), or 'same' if the input is a trk file. Reference that provides the spatial attributes. Typically a nifti-related object from the native diffusion used for streamlines generation.
Space to which the streamlines will be transformed after loading.
Origin to which the streamlines will be shifted after loading: NIFTI standard, default (center of the voxel), or TRACKVIS standard (corner of the voxel).
Verification for negative voxel coordinates or values above the volume dimensions. Default is True, to enforce a valid file.
Verification that the reference has the same header (spatial attributes) as the input tractogram when a trk file is loaded.
Returns the loaded tractogram (the file must have been saved properly).
dipy.io.streamline.load_tractogram(filename, reference, to_space=<Space.RASMM: 'rasmm'>, to_origin=<Origin.NIFTI: 'center'>, bbox_valid_check=True, trk_header_check=True)
Load the stateful tractogram from any format (trk, tck, vtk, fib, dpy).
Filename with valid extension.
trk.header (dict), or 'same' if the input is a trk file. Reference that provides the spatial attributes. Typically a nifti-related object from the native diffusion used for streamlines generation.
Space to which the streamlines will be transformed after loading.
Origin to which the streamlines will be shifted after loading: NIFTI standard, default (center of the voxel), or TRACKVIS standard (corner of the voxel).
Verification for negative voxel coordinates or values above the volume dimensions. Default is True, to enforce a valid file.
Verification that the reference has the same header (spatial attributes) as the input tractogram when a trk file is loaded.
Returns the loaded tractogram (the file must have been saved properly).
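A minimal loading sketch (not from the original docstring; the paths are hypothetical):
>>> from dipy.io.streamline import load_tractogram
>>> from dipy.io.stateful_tractogram import Space
>>> sft = load_tractogram('bundle.trk', 'dwi.nii.gz', to_space=Space.RASMM)
>>> sft_same = load_tractogram('bundle.trk', 'same', bbox_valid_check=False)  # trk carries its own spatial header
>>> n_streamlines = len(sft.streamlines)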
dipy.io.streamline.load_trk(filename, reference, to_space=<Space.RASMM: 'rasmm'>, to_origin=<Origin.NIFTI: 'center'>, bbox_valid_check=True, trk_header_check=True)
Load the stateful tractogram of the .trk format.
Filename with valid extension.
trk.header (dict), or 'same' if the input is a trk file. Reference that provides the spatial attributes. Typically a nifti-related object from the native diffusion used for streamlines generation.
Space to which the streamlines will be transformed after loading.
Origin to which the streamlines will be shifted after loading: NIFTI standard, default (center of the voxel), or TRACKVIS standard (corner of the voxel).
Verification for negative voxel coordinates or values above the volume dimensions. Default is True, to enforce a valid file.
Verification that the reference has the same header (spatial attributes) as the input tractogram when a trk file is loaded.
Returns the loaded tractogram (the file must have been saved properly).
dipy.io.streamline.load_vtk(filename, reference, to_space=<Space.RASMM: 'rasmm'>, to_origin=<Origin.NIFTI: 'center'>, bbox_valid_check=True, trk_header_check=True)
Load the stateful tractogram of the .vtk format.
Filename with valid extension.
trk.header (dict), or 'same' if the input is a trk file. Reference that provides the spatial attributes. Typically a nifti-related object from the native diffusion used for streamlines generation.
Space to which the streamlines will be transformed after loading.
Origin to which the streamlines will be shifted after loading: NIFTI standard, default (center of the voxel), or TRACKVIS standard (corner of the voxel).
Verification for negative voxel coordinates or values above the volume dimensions. Default is True, to enforce a valid file.
Verification that the reference has the same header (spatial attributes) as the input tractogram when a trk file is loaded.
Returns the loaded tractogram (the file must have been saved properly).
dipy.io.streamline.load_vtk_streamlines(filename, to_lps=True)
Load streamlines from vtk polydata.
Supported load formats are VTK and FIB.
Input filename (.vtk or .fib).
to_lps: Default True; follows the vtk file convention for streamlines, which is supported by MITK Diffusion and MI-Brain.
Returns a list of 2D arrays.
dipy.io.streamline.save_dpy(sft, filename, bbox_valid_check=True)
Save the stateful tractogram of the .dpy format.
The stateful tractogram to save.
Filename with valid extension.
Verification for negative voxel coordinates or values above the volume dimensions. Default is True, to enforce a valid file.
Returns True if the saving operation was successful.
dipy.io.streamline.save_fib(sft, filename, bbox_valid_check=True)
Save the stateful tractogram of the .fib format.
The stateful tractogram to save.
Filename with valid extension.
Verification for negative voxel coordinates or values above the volume dimensions. Default is True, to enforce a valid file.
Returns True if the saving operation was successful.
dipy.io.streamline.save_generator(ttype)
Generate a saving function that performs a file extension check to restrict the user to a single file format.
ttype: Extension of the file format that requires a saver.
Returns a function (save_tractogram) that handles only one file format.
dipy.io.streamline.save_tck(sft, filename, bbox_valid_check=True)
Save the stateful tractogram of the .tck format.
The stateful tractogram to save.
Filename with valid extension.
Verification for negative voxel coordinates or values above the volume dimensions. Default is True, to enforce a valid file.
Returns True if the saving operation was successful.
dipy.io.streamline.save_tractogram(sft, filename, bbox_valid_check=True)
Save the stateful tractogram in any format (trk, tck, vtk, fib, dpy).
The stateful tractogram to save.
Filename with valid extension.
Verification for negative voxel coordinates or values above the volume dimensions. Default is True, to enforce a valid file.
Returns True if the saving operation was successful.
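A minimal saving sketch (not from the original docstring; sft is assumed to be an existing StatefulTractogram, e.g. from load_tractogram above, and the paths are hypothetical):
>>> from dipy.io.streamline import save_tractogram
>>> _ = save_tractogram(sft, 'bundle.tck')                            # format chosen from the extension
>>> _ = save_tractogram(sft, 'bundle_raw.trk', bbox_valid_check=False)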
dipy.io.streamline.save_trk(sft, filename, bbox_valid_check=True)
Save the stateful tractogram of the .trk format.
The stateful tractogram to save.
Filename with valid extension.
Verification for negative voxel coordinates or values above the volume dimensions. Default is True, to enforce a valid file.
Returns True if the saving operation was successful.
dipy.io.streamline.save_vtk(sft, filename, bbox_valid_check=True)
Save the stateful tractogram of the .vtk format.
The stateful tractogram to save.
Filename with valid extension.
Verification for negative voxel coordinates or values above the volume dimensions. Default is True, to enforce a valid file.
Returns True if the saving operation was successful.
dipy.io.streamline.save_vtk_streamlines(streamlines, filename, to_lps=True, binary=False)
Save streamlines as vtk polydata to a supported format file.
Supported file formats are OBJ, VTK, FIB, PLY, STL and XML.
streamlines: list of 2D arrays or ArraySequence.
filename: output filename (.obj, .vtk, .fib, .ply, .stl or .xml).
to_lps: Default True; follows the vtk file convention for streamlines, which is supported by MITK Diffusion and MI-Brain.
binary: save the file as binary.
Nifti1Image
dipy.io.utils.Nifti1Image(dataobj, affine, header=None, extra=None, file_map=None)
Bases: nibabel.nifti1.Nifti1Pair, nibabel.filebasedimages.SerializableImage
Class for single file NIfTI1 format image.
in_memory
True when any array data is in memory cache
slicer
Slicer object that returns cropped and subsampled images
Methods
Apply an orientation change and return a new image.
Make file_map for this class from filename filespec.
filespec_to_files class method is deprecated.
Construct image from a byte string.
Class method to create image from mapping in file_map.
Class method to create image from filename filename.
from_files class method is deprecated.
Class method to create new instance of own class from img.
Get affine from image.
Return image data from image with any necessary scaling applied.
Return floating point image data with necessary scaling applied.
Fetch the image filename.
Get header from image.
Return 4x4 affine matrix from qform parameters in header.
Return 4x4 affine matrix from sform parameters in header.
Return shape for image.
Save img in our own format, to name implied by filename.
Class method to create image from filename filename.
Class method to make files holder for this image type.
Plot the image using OrthoSlicer3D.
Return True if filename may be image matching this class.
Sets the files in the object from a given filename.
Set qform header values from 4x4 affine.
Set sform transform from 4x4 affine.
Write image to file_map or contained self.file_map.
Write image to files implied by filename string.
to_files method is deprecated.
to_filespec method is deprecated.
Delete any cached read of data from proxied data.
Harmonize header with image data and affine.
get_data_dtype
set_data_dtype
__init__(dataobj, affine, header=None, extra=None, file_map=None)
Initialize image.
The image is a combination of (array-like, affine matrix, header), with optional metadata in extra, and filename / file-like objects contained in the file_map mapping.
dataobj: Object containing image data. It should be some object that returns an array from np.asanyarray. It should have a shape attribute or property.
affine: homogeneous affine giving the relationship between voxel coordinates and world coordinates. Affine can also be None. In this case, obj.affine also returns None, and the affine as written to disk will depend on the file format.
header: metadata for this image format.
extra: metadata to associate with the image that cannot be stored in the metadata of this image type.
file_map: mapping giving file information for this image format.
Notes
If both a header and an affine are specified, and the affine does not match the affine that is in the header, the affine will be used, but the sform_code and qform_code fields in the header will be re-initialised to their default values. This is performed on the basis that, if you are changing the affine, you are likely to be changing the space to which the affine is pointing. The set_sform() and set_qform() methods can be used to update the codes after an image has been created - see those methods, and the manual for more details.
dipy.io.utils.decfa(img_orig, scale=False)
Create a nifti-compliant directional-encoded color FA image.
Contains encoding of the DEC FA image with a 4D volume of data, where the elements on the last dimension represent R, G and B components.
Whether to scale the incoming data from the 0-1 to the 0-255 range expected in the output.
uint8 in (R, G, B) order.
Notes
For a description of this format, see:
https://nifti.nimh.nih.gov/nifti-1/documentation/nifti1fields/nifti1fields_pages/datatype.html
dipy.io.utils.decfa_to_float(img_orig)
Convert a nifti-compliant directional-encoded color FA image into a nifti image with RGB encoded in floating point resolution.
Contains encoding of the DEC FA image with a 3D volume of data, where each element is a (R, G, B) tuple in uint8.
Notes
For a description of this format, see:
https://nifti.nimh.nih.gov/nifti-1/documentation/nifti1fields/nifti1fields_pages/datatype.html
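A minimal round-trip sketch (not from the original docstrings; the RGB volume here is synthetic):
>>> import numpy as np
>>> import nibabel as nib
>>> from dipy.io.utils import decfa, decfa_to_float
>>> rgb = np.random.rand(5, 5, 5, 3).astype(np.float32)   # R, G, B values in [0, 1]
>>> img = nib.Nifti1Image(rgb, np.eye(4))
>>> img_dec = decfa(img, scale=True)       # scale [0, 1] to 0-255 uint8 (R, G, B) triples
>>> img_back = decfa_to_float(img_dec)     # back to a plain float RGB volume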
dipy.io.utils.detect_format(fileobj)
Returns the StreamlinesFile object guessed from the file-like object.
fileobj: If string, a filename; otherwise an open file-like object pointing to a tractogram file (and ready to read from the beginning of the header).
Returns the TractogramFile class type guessed from the content of fileobj.
dipy.io.utils.get_reference_info(reference)
Extract the spatial attributes (affine, dimensions, voxel sizes and voxel order) from a reference.
reference: Nifti1Header or trk.header (dict). Reference that provides the spatial attributes.
Returns:
affine: ndarray (4, 4), np.float32, transformation of VOX to RASMM
dimensions: ndarray (3,), int16, volume shape for each axis
voxel_sizes: ndarray (3,), float32, size of voxel for each axis
voxel_order: string, typically 'RAS' or 'LPS'
dipy.io.utils.is_header_compatible(reference_1, reference_2)
Will compare the spatial attributes of 2 references.
Nifti1Header or trk.header (dict) Reference that provides the spatial attribute.
Nifti1Header or trk.header (dict) Reference that provides the spatial attribute.
Does all the spatial attribute match
dipy.io.utils.is_reference_info_valid(affine, dimensions, voxel_sizes, voxel_order)
Validate basic data type and value of spatial attribute.
Does not ensure that voxel_sizes and voxel_order are self-coherent with the affine. Only verifies the following:
affine is of the right type (float) and dimension (4, 4)
affine contains values in the rotation part
dimensions is of the right type (int) and length (3)
voxel_sizes is of the right type (float) and length (3)
voxel_order is of the right type (str) and length (3)
The listed parameters are what is expected; provide something else and this function should fail (covers common mistakes).
affine: Transformation of VOX to RASMM
dimensions: Volume shape for each axis
voxel_sizes: Size of voxel for each axis
voxel_order: Typically 'RAS' or 'LPS'
Returns whether the input represents a valid 'state' of spatial attributes.
dipy.io.utils.nifti1_symmat(image_data, *args, **kwargs)
Returns a Nifti1Image with a symmetric matrix intent.
image_data: should have the lower triangular elements of a symmetric matrix along the last dimension.
Returns a 5d image, with the extra dimensions added before the last; it has the symmetric matrix intent code.
dipy.io.utils.optional_package(name, trip_msg=None)
Return package-like thing and module setup for package name.
package name
message to give when someone tries to use the return package, but we could not import it, and have returned a TripWire object instead. Default message if None.
pkg: TripWire instance. If we can import the package, return it. Otherwise return an object raising an error when accessed.
have_pkg: True if import for package was successful, false otherwise.
setup_module: callable usually set as setup_module in calling namespace, to allow skipping tests.
Examples
Typical use would be something like this at the top of a module using an optional package:
>>> from dipy.utils.optpkg import optional_package
>>> pkg, have_pkg, setup_module = optional_package('not_a_package')
Of course in this case the package doesn’t exist, and so, in the module:
>>> have_pkg
False
and
>>> pkg.some_function()
Traceback (most recent call last):
...
TripWireError: We need package not_a_package for these functions, but
``import not_a_package`` raised an ImportError
If the module does exist - we get the module
>>> pkg, _, _ = optional_package('os')
>>> hasattr(pkg, 'path')
True
Or a submodule if that’s what we asked for
>>> subpkg, _, _ = optional_package('os.path')
>>> hasattr(subpkg, 'dirname')
True
dipy.io.utils.read_img_arr_or_path(data, affine=None)
Helper function that handles inputs that can be paths, nifti img or arrays.
Either as a 3D/4D array or as a nifti image object, or as a string containing the full path to a nifti file.
Must be provided for data provided as an array. If provided together with Nifti1Image or str data, this input will over-ride the affine that is stored in the data input. Default: use the affine stored in data.
dipy.io.vtk.load_vtk_streamlines(filename, to_lps=True)
Load streamlines from vtk polydata.
Supported load formats are VTK and FIB.
Input filename (.vtk or .fib).
to_lps: Default True; follows the vtk file convention for streamlines, which is supported by MITK Diffusion and MI-Brain.
Returns a list of 2D arrays.
dipy.io.vtk.optional_package(name, trip_msg=None)
Return package-like thing and module setup for package name.
package name
message to give when someone tries to use the return package, but we could not import it, and have returned a TripWire object instead. Default message if None.
pkg: TripWire instance. If we can import the package, return it. Otherwise return an object raising an error when accessed.
have_pkg: True if import for package was successful, false otherwise.
setup_module: callable usually set as setup_module in calling namespace, to allow skipping tests.
Examples
Typical use would be something like this at the top of a module using an optional package:
>>> from dipy.utils.optpkg import optional_package
>>> pkg, have_pkg, setup_module = optional_package('not_a_package')
Of course in this case the package doesn’t exist, and so, in the module:
>>> have_pkg
False
and
>>> pkg.some_function()
Traceback (most recent call last):
...
TripWireError: We need package not_a_package for these functions, but
``import not_a_package`` raised an ImportError
If the module does exist - we get the module
>>> pkg, _, _ = optional_package('os')
>>> hasattr(pkg, 'path')
True
Or a submodule if that’s what we asked for
>>> subpkg, _, _ = optional_package('os.path')
>>> hasattr(subpkg, 'dirname')
True
dipy.io.vtk.save_vtk_streamlines(streamlines, filename, to_lps=True, binary=False)
Save streamlines as vtk polydata to a supported format file.
Supported file formats are OBJ, VTK, FIB, PLY, STL and XML.
streamlines: list of 2D arrays or ArraySequence.
filename: output filename (.obj, .vtk, .fib, .ply, .stl or .xml).
to_lps: Default True; follows the vtk file convention for streamlines, which is supported by MITK Diffusion and MI-Brain.
binary: save the file as binary.
dipy.io.vtk.transform_streamlines(streamlines, mat, in_place=False)
Apply affine transformation to streamlines.
streamlines: Streamlines object.
mat: transformation matrix.
in_place: If True then the data is changed in place. Be careful: this modifies the input streamlines.
Returns a sequence of transformed 2D ndarrays of shape[-1] == 3.
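A minimal sketch (not from the original docstring; transform_streamlines is defined in dipy.tracking.streamline and re-used here, and the data is synthetic):
>>> import numpy as np
>>> from dipy.tracking.streamline import transform_streamlines
>>> streamlines = [np.array([[0.0, 0.0, 0.0], [1.0, 1.0, 1.0]]),
...                np.array([[2.0, 2.0, 2.0], [3.0, 3.0, 3.0]])]
>>> shift = np.eye(4)
>>> shift[:3, 3] = [10, 0, 0]          # translate 10 units along x
>>> moved = transform_streamlines(streamlines, shift)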