Re: frustrated by hdf_sd_adddata
George McCabe wrote:
> attempting to replace a subset of existing data in an HDF SD, using
> idl procedure hdf_sd_adddata, receive the following error which i can
> not readily diagnose -
> % HDF_SD_ADDDATA: Unable to write the specified HDF-SD slice.
> hdf opens, sd starts /RDWR ok, sd selected ok
> i have verified the size and order of the data variable and of the
> COUNT and START keyword variables.
Does the following example work on your system?
hdfid = hdf_sd_start('test.hdf', /CREATE)
varid = hdf_sd_create(hdfid, 'data', [256, 256], /float)
hdf_sd_adddata, varid, dist(256)
hdf_sd_endaccess, varid
hdf_sd_end, hdfid
If it does, the HDF interface on your platform is working, and the
problem is more likely in the definition of the START and COUNT keywords.
The following is from the IDL 5.3 online documentation:
START
Set this keyword to a vector that contains the starting position for the
data. The default position is [0, 0, ..., 0].

COUNT
Set this keyword to a vector of counts (i.e., the number of items) to be
written in each dimension. The default is to write all available data.

STRIDE
Set this keyword to a vector that contains the strides, or sampling
intervals, between accessed values of the NetCDF variable. The default
stride vector is that for a contiguous write: [0, 0, ..., 0].
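As a rough sketch of a subset write (assuming a file 'test.hdf' that
already contains a 256 x 256 SDS named 'data'), something like the
following should work; note that START and COUNT each need one element
per dimension, and START + COUNT must not exceed the dataset size in
any dimension:

```idl
; Open an existing file for read/write and select the dataset by name.
hdfid = hdf_sd_start('test.hdf', /RDWR)
index = hdf_sd_nametoindex(hdfid, 'data')
varid = hdf_sd_select(hdfid, index)

; Write a 64 x 64 subset into the middle of the 256 x 256 dataset.
subset = findgen(64, 64)
hdf_sd_adddata, varid, subset, START=[96, 96], COUNT=[64, 64]

; Always release the dataset and close the file.
hdf_sd_endaccess, varid
hdf_sd_end, hdfid
```

A mismatch between the dimensions of the data variable and the COUNT
vector, or a START + COUNT that runs past the edge of the dataset, will
produce exactly the "Unable to write the specified HDF-SD slice" error.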
I'd try to reduce the code to the smallest number of lines that
reproduces the error. This process often makes the source of the error
obvious.