API

Entry Points

Canvas

Canvas([plot_width, plot_height, x_range, …]) An abstract canvas representing the space in which to bin.
Canvas.line(source, x, y[, agg, axis]) Compute a reduction by pixel, mapping data to pixels as one or more lines.
Canvas.points(source, x, y[, agg]) Compute a reduction by pixel, mapping data to pixels as points.
Canvas.raster(source[, layer, …]) Sample a raster dataset by canvas size and bounds.
Canvas.trimesh(vertices, simplices[, mesh, …]) Compute a reduction by pixel, mapping data to pixels as a triangle.
Canvas.validate() Check that parameter settings are valid for this object.

Pipeline

Pipeline(df, glyph[, agg, transform_fn, …]) A datashading pipeline callback.

Edge Bundling

directly_connect_edges alias of datashader.bundling.connect_edges
hammer_bundle(**params) Iteratively group edges and return as paths suitable for datashading.

Glyphs

Point

Point(x, y) A point, with center at x and y.
Point.inputs
Point.validate(in_dshape)

Line

Line
Line.inputs
Line.validate

Reductions

any([column]) Whether any elements in column map to each bin.
count([column]) Count elements in each bin.
count_cat([column]) Count of all elements in column, grouped by category.
first([column]) First value encountered in column.
last([column]) Last value encountered in column.
m2([column]) Sum of square differences from the mean of all elements in column.
max([column]) Maximum value of all elements in column.
mean([column]) Mean of all elements in column.
min([column]) Minimum value of all elements in column.
mode([column]) Mode (most common value) of all the values encountered in column.
std([column]) Standard Deviation of all elements in column.
sum([column]) Sum of all elements in column.
summary(**kwargs) A collection of named reductions.
var([column]) Variance of all elements in column.

Transfer Functions

Image

Image(data[, coords, dims, name, attrs, …])
Image.to_bytesio([format, origin])
Image.to_pil([origin])

Images

Images(*images) A list of HTML-representable objects to display in a table.
Images.cols(n) Set the number of columns to use in the HTML table.

Other

dynspread(img[, threshold, max_px, shape, …]) Spread pixels in an image dynamically based on the image density.
set_background(img[, color, name]) Return a new image, with the background set to color.
shade(agg[, cmap, color_key, how, alpha, …]) Convert a DataArray to an image by choosing an RGBA pixel color for each value.
spread(img[, px, shape, how, mask, name]) Spread pixels in an image.
stack(*imgs, **kwargs) Combine images together, overlaying later images onto earlier ones.

Definitions

class datashader.Canvas(plot_width=600, plot_height=600, x_range=None, y_range=None, x_axis_type='linear', y_axis_type='linear')[source]

An abstract canvas representing the space in which to bin.

Parameters:
plot_width, plot_height : int, optional

Width and height of the output aggregate in pixels.

x_range, y_range : tuple, optional

A tuple representing the inclusive bounds [min, max] along the axis.

x_axis_type, y_axis_type : str, optional

The type of the axis. Valid options are 'linear' [default], and 'log'.

Methods

line(source, x, y[, agg, axis]) Compute a reduction by pixel, mapping data to pixels as one or more lines.
points(source, x, y[, agg]) Compute a reduction by pixel, mapping data to pixels as points.
raster(source[, layer, upsample_method, …]) Sample a raster dataset by canvas size and bounds.
trimesh(vertices, simplices[, mesh, agg, …]) Compute a reduction by pixel, mapping data to pixels as a triangle.
validate() Check that parameter settings are valid for this object.
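
A minimal sketch of typical usage, assuming a pandas DataFrame with illustrative columns 'x' and 'y' (the data here is not part of the API):

>>> import numpy as np
>>> import pandas as pd
>>> import datashader as ds
>>> df = pd.DataFrame({'x': np.random.randn(10000),   # illustrative data
...                    'y': np.random.randn(10000)})
>>> cvs = ds.Canvas(plot_width=400, plot_height=400,
...                 x_range=(-3, 3), y_range=(-3, 3))
>>> agg = cvs.points(df, 'x', 'y', agg=ds.count())  # xarray.DataArray of counts
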
class datashader.Pipeline(df, glyph, agg=<datashader.reductions.count object>, transform_fn=<function identity>, color_fn=<function shade>, spread_fn=<function dynspread>, width_scale=1.0, height_scale=1.0)[source]

A datashading pipeline callback.

Given a declarative specification, creates a callable with the following signature:

callback(x_range, y_range, width, height)

where x_range and y_range form the bounding box on the viewport, and width and height specify the output image dimensions.

Parameters:
df : pandas.DataFrame, dask.DataFrame
glyph : Glyph

The glyph to bin by.

agg : Reduction, optional

The reduction to compute per-pixel. Default is count().

transform_fn : callable, optional

A callable that takes the computed aggregate as an argument, and returns another aggregate. This can be used for preprocessing before the result is passed to color_fn.

color_fn : callable, optional

A callable that takes the output of transform_fn, and returns an Image object. Default is shade.

spread_fn : callable, optional

A callable that takes the output of color_fn, and returns another Image object. Default is dynspread.

height_scale : float, optional

Factor by which to scale the provided height.

width_scale : float, optional

Factor by which to scale the provided width.

Methods

__call__([x_range, y_range, width, height]) Compute an image from the specified pipeline.
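
A minimal sketch of constructing and invoking a pipeline, assuming an illustrative DataFrame with 'x' and 'y' columns:

>>> import numpy as np
>>> import pandas as pd
>>> import datashader as ds
>>> from datashader.glyphs import Point
>>> df = pd.DataFrame({'x': np.random.randn(1000),   # illustrative data
...                    'y': np.random.randn(1000)})
>>> pipeline = ds.Pipeline(df, Point('x', 'y'))
>>> img = pipeline(x_range=(-3, 3), y_range=(-3, 3), width=400, height=400)
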
datashader.bundling.directly_connect_edges

alias of datashader.bundling.connect_edges

class datashader.bundling.hammer_bundle(**params)[source]

Iteratively group edges and return as paths suitable for datashading.

Breaks each edge into a path with multiple line segments, and iteratively curves this path to bundle edges into groups.

Methods

__call__(nodes, edges, **params) Convert a graph data structure into a path structure for plotting
debug(**kwargs) Inspect .param.debug method for the full docstring
defaults(**kwargs) Inspect .param.defaults method for the full docstring
force_new_dynamic_value
get_param_values
get_value_generator
inspect_value
instance
message(**kwargs) Inspect .param.message method for the full docstring
params
pprint([imports, prefix, unknown_value, …]) Same as Parameterized.pprint, except that X.classname(Y is replaced with X.classname.instance(Y
print_param_defaults(*args, **kwargs) Inspect .param.print_param_defaults method for the full docstring
print_param_values(**kwargs) Inspect .param.print_param_values method for the full docstring
script_repr([imports, prefix]) Same as Parameterized.script_repr, except that X.classname(Y is replaced with X.classname.instance(Y
set_default(*args, **kwargs) Inspect .param.set_default method for the full docstring
set_dynamic_time_fn
set_param
state_pop() Restore the most recently saved state.
state_push() Save this instance’s state.
verbose(**kwargs) Inspect .param.verbose method for the full docstring
warning(**kwargs) Inspect .param.warning method for the full docstring
param String x (default='x'; inherited from connect_edges)
Column name for each node’s x coordinate.
param String y (default='y'; inherited from connect_edges)
Column name for each node’s y coordinate.
param String source (default='source'; inherited from connect_edges)
Column name for each edge’s source.
param String target (default='target'; inherited from connect_edges)
Column name for each edge’s target.
param String weight (default='weight', allow_None=True)
Column name for each edge weight. If None, weights are ignored.
param Boolean include_edge_id (default=False; inherited from connect_edges)
Include edge IDs in bundled dataframe.
param Number initial_bandwidth (default=0.05, bounds=(0.0, None))
Initial value of the bandwidth….
param Number decay (default=0.7, bounds=(0.0, 1.0))
Rate of decay in the bandwidth value, with 1.0 indicating no decay.
param Integer iterations (default=4, bounds=(1, None))
Number of passes for the smoothing algorithm.
param Integer batch_size (default=20000, bounds=(1, None))
Number of edges to process together.
param Number tension (default=0.3, bounds=(0, None))
Exponential smoothing factor to use when smoothing.
param Integer accuracy (default=500, bounds=(1, None))
Number of entries in table for…
param Integer advect_iterations (default=50, bounds=(0, None))
Number of iterations to move edges along gradients.
param Number min_segment_length (default=0.008, bounds=(0, None))
Minimum length (in data space?) for an edge segment.
param Number max_segment_length (default=0.016, bounds=(0, None))
Maximum length (in data space?) for an edge segment.
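
A minimal sketch of bundling a small graph; the node and edge DataFrames are illustrative, with nodes referenced by integer index and column names matching the parameter defaults above:

>>> import pandas as pd
>>> from datashader.bundling import hammer_bundle
>>> nodes = pd.DataFrame({'x': [0.0, 1.0, 0.0, 1.0],   # illustrative graph
...                       'y': [0.0, 0.0, 1.0, 1.0]})
>>> edges = pd.DataFrame({'source': [0, 0, 1], 'target': [1, 2, 3]})
>>> paths = hammer_bundle(nodes, edges)  # x,y path points, NaN-separated per edge
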
class datashader.glyphs.Point(x, y)[source]

A point, with center at x and y.

Points map each record to a single bin. Points falling exactly on the upper bounds are treated as a special case, mapping into the previous bin rather than being cropped off.

Parameters:
x, y : str

Column names for the x and y coordinates of each point.

Attributes:
inputs
x_label
y_label

Methods

compute_bounds_dask  
compute_x_bounds  
compute_y_bounds  
maybe_expand_bounds  
required_columns  
validate  
class datashader.reductions.any(column=None)[source]

Whether any elements in column map to each bin.

Parameters:
column : str, optional

If provided, elements in column that are NaN are skipped.

Attributes:
inputs

Methods

out_dshape  
validate  
class datashader.reductions.count(column=None)[source]

Count elements in each bin.

Parameters:
column : str, optional

If provided, only counts elements in column that are not NaN. Otherwise, counts every element.

Attributes:
inputs

Methods

out_dshape  
validate  
class datashader.reductions.count_cat(column=None)[source]

Count of all elements in column, grouped by category.

Parameters:
column : str

Name of the column to aggregate over. Column data type must be categorical. The resulting aggregate has an outer dimension axis along the categories present.

Attributes:
inputs

Methods

out_dshape  
validate  
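
A minimal sketch with illustrative data; note that the column must first be converted to a categorical dtype:

>>> import numpy as np
>>> import pandas as pd
>>> import datashader as ds
>>> df = pd.DataFrame({'x': np.random.randn(1000),    # illustrative data
...                    'y': np.random.randn(1000),
...                    'cat': pd.Categorical(np.random.choice(['a', 'b'], 1000))})
>>> cvs = ds.Canvas(plot_width=100, plot_height=100)
>>> agg = cvs.points(df, 'x', 'y', agg=ds.count_cat('cat'))  # dims include 'cat'
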
class datashader.reductions.first(column=None)[source]

First value encountered in column.

Useful for categorical data where an actual value must always be returned, not an average or other numerical calculation.

Currently only supported for rasters, externally to this class.

Parameters:
column : str

Name of the column to aggregate over. If the data type is floating point, NaN values in the column are skipped.

Attributes:
inputs

Methods

out_dshape  
validate  
class datashader.reductions.last(column=None)[source]

Last value encountered in column.

Useful for categorical data where an actual value must always be returned, not an average or other numerical calculation.

Currently only supported for rasters, externally to this class.

Parameters:
column : str

Name of the column to aggregate over. If the data type is floating point, NaN values in the column are skipped.

Attributes:
inputs

Methods

out_dshape  
validate  
class datashader.reductions.m2(column=None)[source]

Sum of square differences from the mean of all elements in column.

Intermediate value for computing var and std, not intended to be used on its own.

Parameters:
column : str

Name of the column to aggregate over. Column data type must be numeric. NaN values in the column are skipped.

Attributes:
inputs

Methods

out_dshape  
validate  
class datashader.reductions.max(column=None)[source]

Maximum value of all elements in column.

Parameters:
column : str

Name of the column to aggregate over. Column data type must be numeric. NaN values in the column are skipped.

Attributes:
inputs

Methods

out_dshape  
validate  
class datashader.reductions.mean(column=None)[source]

Mean of all elements in column.

Parameters:
column : str

Name of the column to aggregate over. Column data type must be numeric. NaN values in the column are skipped.

Attributes:
inputs

Methods

out_dshape  
validate  
class datashader.reductions.min(column=None)[source]

Minimum value of all elements in column.

Parameters:
column : str

Name of the column to aggregate over. Column data type must be numeric. NaN values in the column are skipped.

Attributes:
inputs

Methods

out_dshape  
validate  
class datashader.reductions.mode(column=None)[source]

Mode (most common value) of all the values encountered in column.

Useful for categorical data where an actual value must always be returned, not an average or other numerical calculation.

Currently only supported for rasters, externally to this class. Implementing it for other glyph types would be difficult due to potentially unbounded data storage requirements to store indefinite point or line data per pixel.

Parameters:
column : str

Name of the column to aggregate over. If the data type is floating point, NaN values in the column are skipped.

Attributes:
inputs

Methods

out_dshape  
validate  
class datashader.reductions.std(column=None)[source]

Standard Deviation of all elements in column.

Parameters:
column : str

Name of the column to aggregate over. Column data type must be numeric. NaN values in the column are skipped.

Attributes:
inputs

Methods

out_dshape  
validate  
class datashader.reductions.sum(column=None)[source]

Sum of all elements in column.

Parameters:
column : str

Name of the column to aggregate over. Column data type must be numeric. NaN values in the column are skipped.

Attributes:
inputs

Methods

out_dshape  
validate  
class datashader.reductions.summary(**kwargs)[source]

A collection of named reductions.

Computes all aggregates simultaneously, with the output stored as an xarray.Dataset.

Examples

A reduction for computing the mean of column “a”, and the sum of column “b” for each bin, all in a single pass.

>>> import datashader as ds
>>> red = ds.summary(mean_a=ds.mean('a'), sum_b=ds.sum('b'))
Attributes:
inputs

Methods

out_dshape  
validate  
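
Passing such a reduction to a Canvas method produces an xarray.Dataset with one data variable per named reduction. A brief sketch continuing the example above, with illustrative data:

>>> import numpy as np
>>> import pandas as pd
>>> df = pd.DataFrame({'x': np.random.randn(1000), 'y': np.random.randn(1000),
...                    'a': np.random.randn(1000), 'b': np.random.randn(1000)})
>>> cvs = ds.Canvas(plot_width=100, plot_height=100)
>>> agg = cvs.points(df, 'x', 'y', agg=red)  # Dataset with 'mean_a' and 'sum_b'
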
class datashader.reductions.var(column=None)[source]

Variance of all elements in column.

Parameters:
column : str

Name of the column to aggregate over. Column data type must be numeric. NaN values in the column are skipped.

Attributes:
inputs

Methods

out_dshape  
validate  
class datashader.transfer_functions.Image(data, coords=None, dims=None, name=None, attrs=None, encoding=None, fastpath=False)[source]
Attributes:
T
attrs

Dictionary storing arbitrary metadata with this array.

chunks

Block dimensions for this array’s data or None if it’s not a dask array.

coords

Dictionary-like container of coordinate arrays.

data

The array’s data as a dask or numpy array

dims

Tuple of dimension names associated with this array.

dt

Access datetime fields for DataArrays with datetime-like dtypes.

dtype
encoding

Dictionary of format-specific settings for how this array should be serialized.

imag
indexes

OrderedDict of pandas.Index objects used for label based indexing

loc

Attribute for location based indexing like pandas.

name

The name of this array.

nbytes
ndim
plot

Access plotting functions

real
shape
size
sizes

Ordered mapping from dimension names to lengths.

values

The array’s data as a numpy.ndarray

variable

Low level interface to the Variable object for this DataArray.

Methods

all([dim, axis]) Reduce this DataArray’s data by applying all along some dimension(s).
any([dim, axis]) Reduce this DataArray’s data by applying any along some dimension(s).
argmax([dim, axis, skipna]) Reduce this DataArray’s data by applying argmax along some dimension(s).
argmin([dim, axis, skipna]) Reduce this DataArray’s data by applying argmin along some dimension(s).
argsort([axis, kind, order]) Returns the indices that would sort this array.
assign_attrs(*args, **kwargs) Assign new attrs to this object.
assign_coords(**kwargs) Assign new coordinates to this object.
astype(dtype[, order, casting, subok, copy]) Copy of the array, cast to a specified type.
bfill(dim[, limit]) Fill NaN values by propagating values backward
broadcast_equals(other) Two DataArrays are broadcast equal if they are equal after broadcasting them against each other such that they have the same dimensions.
chunk([chunks, name_prefix, token, lock]) Coerce this array’s data into a dask array with the given chunks.
clip([min, max, out]) Return an array whose values are limited to [min, max].
close() Close any files linked to this object
combine_first(other) Combine two DataArray objects, with union of coordinates.
compute(**kwargs) Manually trigger loading of this array’s data from disk or a remote source into memory and return a new array.
conj() Complex-conjugate all elements.
conjugate() Return the complex conjugate, element-wise.
copy([deep, data]) Returns a copy of this array.
count([dim, axis]) Reduce this DataArray’s data by applying count along some dimension(s).
cumprod([dim, axis, skipna]) Apply cumprod along some dimension of DataArray.
cumsum([dim, axis, skipna]) Apply cumsum along some dimension of DataArray.
diff(dim[, n, label]) Calculate the n-th order discrete difference along given axis.
differentiate(coord[, edge_order, datetime_unit]) Differentiate the array with the second order accurate central differences.
dot(other[, dims]) Perform dot product of two DataArrays along their shared dims.
drop(labels[, dim]) Drop coordinates or index labels from this DataArray.
dropna(dim[, how, thresh]) Returns a new array with dropped labels for missing values along the provided dimension.
equals(other) True if two DataArrays have the same dimensions, coordinates and values; otherwise False.
expand_dims(dim[, axis]) Return a new object with an additional axis (or axes) inserted at the corresponding position in the array shape.
ffill(dim[, limit]) Fill NaN values by propagating values forward
fillna(value) Fill missing values in this object.
from_cdms2(variable) Convert a cdms2.Variable into an xarray.DataArray
from_dict(d) Convert a dictionary into an xarray.DataArray
from_iris(cube) Convert an iris.cube.Cube into an xarray.DataArray
from_series(series) Convert a pandas.Series into an xarray.DataArray.
get_axis_num(dim) Return axis number(s) corresponding to dimension(s) in this array.
get_index(key) Get an index for a dimension, with fall-back to a default RangeIndex
groupby(group[, squeeze]) Returns a GroupBy object for performing grouped operations.
groupby_bins(group, bins[, right, labels, …]) Returns a GroupBy object for performing grouped operations.
identical(other) Like equals, but also checks the array name and attributes, and attributes on all coordinates.
interp([coords, method, assume_sorted, kwargs]) Multidimensional interpolation of variables.
interp_like(other[, method, assume_sorted, …]) Interpolate this object onto the coordinates of another object, filling out of range values with NaN.
interpolate_na([dim, method, limit, …]) Interpolate values according to different methods.
isel([indexers, drop]) Return a new DataArray whose dataset is given by integer indexing along the specified dimension(s).
isel_points([dim]) Return a new DataArray whose dataset is given by pointwise integer indexing along the specified dimension(s).
isin(test_elements) Tests each value in the array for whether it is in the supplied list.
item(*args) Copy an element of an array to a standard Python scalar and return it.
load(**kwargs) Manually trigger loading of this array’s data from disk or a remote source into memory and return this array.
max([dim, axis, skipna]) Reduce this DataArray’s data by applying max along some dimension(s).
mean([dim, axis, skipna]) Reduce this DataArray’s data by applying mean along some dimension(s).
median([dim, axis, skipna]) Reduce this DataArray’s data by applying median along some dimension(s).
min([dim, axis, skipna]) Reduce this DataArray’s data by applying min along some dimension(s).
persist(**kwargs) Trigger computation in constituent dask arrays
pipe(func, *args, **kwargs) Apply func(self, *args, **kwargs)
prod([dim, axis, skipna]) Reduce this DataArray’s data by applying prod along some dimension(s).
quantile(q[, dim, interpolation, keep_attrs]) Compute the qth quantile of the data along the specified dimension.
rank(dim[, pct, keep_attrs]) Ranks the data.
reduce(func[, dim, axis, keep_attrs]) Reduce this array by applying func along some dimension(s).
reindex([indexers, method, tolerance, copy]) Conform this object onto a new set of indexes, filling in missing values with NaN.
reindex_like(other[, method, tolerance, copy]) Conform this object onto the indexes of another object, filling in missing values with NaN.
rename([new_name_or_name_dict]) Returns a new DataArray with renamed coordinates or a new name.
reorder_levels([dim_order, inplace]) Rearrange index levels using input order.
resample([indexer, skipna, closed, label, …]) Returns a Resample object for performing resampling operations.
reset_coords([names, drop, inplace]) Given names of coordinates, reset them to become variables.
reset_index(dims_or_levels[, drop, inplace]) Reset the specified index(es) or multi-index level(s).
roll([shifts, roll_coords]) Roll this array by an offset along one or more dimensions.
rolling([dim, min_periods, center]) Rolling window object.
searchsorted(v[, side, sorter]) Find indices where elements of v should be inserted to maintain order.
sel([indexers, method, tolerance, drop]) Return a new DataArray whose dataset is given by selecting index labels along the specified dimension(s).
sel_points([dim, method, tolerance]) Return a new DataArray whose dataset is given by pointwise selection of index labels along the specified dimension(s).
set_index([indexes, append, inplace]) Set DataArray (multi-)indexes using one or more existing coordinates.
shift([shifts, fill_value]) Shift this array by an offset along one or more dimensions.
sortby(variables[, ascending]) Sort object by labels or values (along an axis).
squeeze([dim, drop, axis]) Return a new object with squeezed data.
stack([dimensions]) Stack any number of existing dimensions into a single new dimension.
std([dim, axis, skipna]) Reduce this DataArray’s data by applying std along some dimension(s).
sum([dim, axis, skipna]) Reduce this DataArray’s data by applying sum along some dimension(s).
swap_dims(dims_dict) Returns a new DataArray with swapped dimensions.
to_cdms2() Convert this array into a cdms2.Variable
to_dataframe([name]) Convert this array and its coordinates into a tidy pandas.DataFrame.
to_dataset([dim, name]) Convert a DataArray to a Dataset.
to_dict() Convert this xarray.DataArray into a dictionary following xarray naming conventions.
to_index() Convert this variable to a pandas.Index.
to_iris() Convert this array into an iris.cube.Cube
to_masked_array([copy]) Convert this array into a numpy.ma.MaskedArray
to_netcdf(*args, **kwargs) Write DataArray contents to a netCDF file.
to_pandas() Convert this array into a pandas object with the same shape.
to_series() Convert this array into a pandas.Series.
transpose(*dims) Return a new DataArray object with transposed dimensions.
unstack([dim]) Unstack existing dimensions corresponding to MultiIndexes into multiple new dimensions.
var([dim, axis, skipna]) Reduce this DataArray’s data by applying var along some dimension(s).
where(cond[, other, drop]) Filter elements from this object according to a condition.
isnull  
notnull  
round  
to_bytesio  
to_pil  
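
to_bytesio and to_pil are the datashader-specific additions to xarray.DataArray. A minimal sketch of exporting a shaded image, with illustrative data:

>>> import numpy as np
>>> import pandas as pd
>>> import datashader as ds
>>> import datashader.transfer_functions as tf
>>> df = pd.DataFrame({'x': np.random.randn(1000),   # illustrative data
...                    'y': np.random.randn(1000)})
>>> agg = ds.Canvas(plot_width=200, plot_height=200).points(df, 'x', 'y')
>>> img = tf.shade(agg)     # a transfer_functions.Image
>>> pil_img = img.to_pil()  # PIL image; can be saved with pil_img.save(...)
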
datashader.transfer_functions.stack(*imgs, **kwargs)[source]

Combine images together, overlaying later images onto earlier ones.

Parameters:
imgs : iterable of Image

The images to combine.

how : str, optional

The compositing operator to combine pixels. Default is ‘over’.
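
A minimal sketch of overlaying two shaded layers, with illustrative data:

>>> import numpy as np
>>> import pandas as pd
>>> import datashader as ds
>>> import datashader.transfer_functions as tf
>>> cvs = ds.Canvas(plot_width=200, plot_height=200,
...                 x_range=(-3, 3), y_range=(-3, 3))
>>> df1 = pd.DataFrame({'x': np.random.randn(1000), 'y': np.random.randn(1000)})
>>> df2 = pd.DataFrame({'x': np.random.randn(1000) + 1, 'y': np.random.randn(1000)})
>>> img1 = tf.shade(cvs.points(df1, 'x', 'y'), cmap=['lightblue', 'darkblue'])
>>> img2 = tf.shade(cvs.points(df2, 'x', 'y'), cmap=['pink', 'red'])
>>> combined = tf.stack(img1, img2, how='over')  # img2 composited over img1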

datashader.transfer_functions.shade(agg, cmap=['lightblue', 'darkblue'], color_key=['#e41a1c', '#377eb8', '#4daf4a', '#984ea3', '#ff7f00', '#ffff33', '#a65628', '#f781bf', '#999999', '#66c2a5', '#fc8d62', '#8da0cb', '#a6d854', '#ffd92f', '#e5c494', '#ffffb3', '#fb8072', '#fdb462', '#fccde5', '#d9d9d9', '#ccebc5', '#ffed6f'], how='eq_hist', alpha=255, min_alpha=40, span=None, name=None)[source]

Convert a DataArray to an image by choosing an RGBA pixel color for each value.

Requires a DataArray with a single data dimension, here called the “value”, indexed using either 2D or 3D coordinates.

For a DataArray with 2D coordinates, the RGB channels are computed from the values by interpolated lookup into the given colormap cmap. The A channel is then set to the given fixed alpha value for all non-zero values, and to zero for all zero values.

DataArrays with 3D coordinates are expected to contain values distributed over different categories that are indexed by the additional coordinate. Such an array would reduce to the 2D-coordinate case if collapsed across the categories (e.g. if one did aggc.sum(dim='cat') for a categorical dimension cat). The RGB channels for the uncollapsed, 3D case are computed by averaging the colors in the provided color_key (with one color per category), weighted by the array’s value for that category. The A channel is then computed from the array’s total value collapsed across all categories at that location, ranging from the specified min_alpha to the maximum alpha value (255).

Parameters:
agg : DataArray
cmap : list of colors or matplotlib.colors.Colormap, optional

The colormap to use for 2D agg arrays. Can be either a list of colors (specified by name, RGBA hexcode, or as a tuple of (red, green, blue) values), or a matplotlib colormap object. Default is ["lightblue", "darkblue"].

color_key : dict or iterable

The colors to use for a 3D (categorical) agg array. Can be either a dict mapping from field name to colors, or an iterable of colors in the same order as the record fields, and including at least that many distinct colors.

how : str or callable, optional

The interpolation method to use, for the cmap of a 2D DataArray or the alpha channel of a 3D DataArray. Valid strings are ‘eq_hist’ [default], ‘cbrt’ (cube root), ‘log’ (logarithmic), and ‘linear’. Callables take 2 arguments - a 2-dimensional array of magnitudes at each pixel, and a boolean mask array indicating missingness. They should return a numeric array of the same shape, with NaN values where the mask was True.

alpha : int, optional

Value between 0 - 255 representing the alpha value to use for colormapped pixels that contain data (i.e. non-NaN values). Regardless of this value, NaN values are set to be fully transparent when doing colormapping.

min_alpha : float, optional

The minimum alpha value to use for non-empty pixels when doing colormapping, in [0, 255]. Use a higher value to avoid undersaturation, i.e. poorly visible low-value datapoints, at the expense of the overall dynamic range.

span : list of min-max range, optional

Min and max data values to use for colormap interpolation, when wishing to override autoranging.

name : string name, optional

Optional string name to give to the Image object to return, to label results for display.
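
A minimal sketch of the 2D (non-categorical) case, with set_background (documented below) supplying an opaque backdrop; the data is illustrative:

>>> import numpy as np
>>> import pandas as pd
>>> import datashader as ds
>>> import datashader.transfer_functions as tf
>>> df = pd.DataFrame({'x': np.random.randn(10000),   # illustrative data
...                    'y': np.random.randn(10000)})
>>> agg = ds.Canvas(plot_width=300, plot_height=300).points(df, 'x', 'y')
>>> img = tf.shade(agg, cmap=['lightblue', 'darkblue'], how='log')
>>> img = tf.set_background(img, 'black')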

datashader.transfer_functions.set_background(img, color=None, name=None)[source]

Return a new image, with the background set to color.

Parameters:
img : Image
color : color name or tuple, optional

The background color. Can be specified either by name, hexcode, or as a tuple of (red, green, blue) values.

datashader.transfer_functions.spread(img, px=1, shape='circle', how='over', mask=None, name=None)[source]

Spread pixels in an image.

Spreading expands each pixel a certain number of pixels on all sides according to a given shape, merging pixels using a specified compositing operator. This can be useful to make sparse plots more visible.

Parameters:
img : Image
px : int, optional

Number of pixels to spread on all sides

shape : str, optional

The shape to spread by. Options are ‘circle’ [default] or ‘square’.

how : str, optional

The name of the compositing operator to use when combining pixels.

mask : ndarray, shape (M, M), optional

The mask to spread over. If provided, this mask is used instead of generating one based on px and shape. Must be a square array with odd dimensions. Pixels are spread from the center of the mask to locations where the mask is True.

name : string name, optional

Optional string name to give to the Image object to return, to label results for display.
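
A minimal sketch with illustrative data; a sparse scatter benefits most from spreading:

>>> import numpy as np
>>> import pandas as pd
>>> import datashader as ds
>>> import datashader.transfer_functions as tf
>>> df = pd.DataFrame({'x': np.random.randn(100),   # illustrative sparse data
...                    'y': np.random.randn(100)})
>>> img = tf.shade(ds.Canvas(plot_width=300, plot_height=300).points(df, 'x', 'y'))
>>> spread_img = tf.spread(img, px=2, shape='circle')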

datashader.transfer_functions.dynspread(img, threshold=0.5, max_px=3, shape='circle', how='over', name=None)[source]

Spread pixels in an image dynamically based on the image density.

Spreading expands each pixel a certain number of pixels on all sides according to a given shape, merging pixels using a specified compositing operator. This can be useful to make sparse plots more visible. Dynamic spreading determines how many pixels to spread based on a density heuristic. Spreading starts at 1 pixel, and stops when the fraction of adjacent non-empty pixels reaches the specified threshold, or the max_px is reached, whichever comes first.

Parameters:
img : Image
threshold : float, optional

A tuning parameter in [0, 1], with higher values giving more spreading.

max_px : int, optional

Maximum number of pixels to spread on all sides.

shape : str, optional

The shape to spread by. Options are ‘circle’ [default] or ‘square’.

how : str, optional

The name of the compositing operator to use when combining pixels.
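
name : string name, optional

Optional string name to give to the Image object to return, to label results for display.

A minimal sketch with illustrative data:

>>> import numpy as np
>>> import pandas as pd
>>> import datashader as ds
>>> import datashader.transfer_functions as tf
>>> df = pd.DataFrame({'x': np.random.randn(100),   # illustrative sparse data
...                    'y': np.random.randn(100)})
>>> img = tf.shade(ds.Canvas(plot_width=300, plot_height=300).points(df, 'x', 'y'))
>>> dyn_img = tf.dynspread(img, threshold=0.5, max_px=3)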