pycroscopy issues
https://code.ornl.gov/rvv/pycroscopy/-/issues

Issue #239: SVD error (2020-12-20)
https://code.ornl.gov/rvv/pycroscopy/-/issues/239
Vasudevan, Rama K.
Created by: sulaymandesai
Hi,
I have been following the example notebooks on this GitHub page to perform SVD. I get the following error:
```
1 decomposer = px.processing.svd_utils.SVD(h5_main, num_components=100)
----> 2 h5_svd_group = decomposer.compute()
3
4 h5_u = h5_svd_group['U']
5 h5_v = h5_svd_group['V']
~/.pyenv/versions/3.8.3/lib/python3.8/site-packages/pycroscopy/processing/svd_utils.py in compute(self, override)
161 """
162 if self.__u is None and self.__v is None and self.__s is None:
--> 163 self.test(override=override)
164
165 if self.h5_results_grp is None:
~/.pyenv/versions/3.8.3/lib/python3.8/site-packages/pycroscopy/processing/svd_utils.py in test(self, override)
137 raise ValueError('Could not reshape U to N-Dimensional dataset! Error:' + success)
138
--> 139 v_mat, success = reshape_to_n_dims(self.__v, h5_pos=np.expand_dims(np.arange(self.__u.shape[1]), axis=1),
140 h5_spec=self.h5_main.h5_spec_inds)
141 if not success:
~/.pyenv/versions/3.8.3/lib/python3.8/site-packages/pyUSID/io/hdf_utils/model.py in reshape_to_n_dims(h5_main, h5_pos, h5_spec, get_labels, verbose, sort_dims, lazy)
84 else:
85 if not isinstance(h5_main, (h5py.Dataset, np.ndarray, da.core.Array)):
---> 86 raise TypeError('h5_main should either be a h5py.Dataset or numpy array')
87
88 if h5_pos is not None:
TypeError: h5_main should either be a h5py.Dataset or numpy array
```
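For context, the guard that raises this error can be reproduced in isolation. This is a sketch of the type check in pyUSID's reshape_to_n_dims, not the actual library code: the real function rejects anything that is not an h5py.Dataset, NumPy array, or Dask array, so an argument that is None (for example, an SVD result that was never populated) fails exactly this way.

```python
import numpy as np

def check_main(h5_main):
    # Sketch of the type guard in pyUSID's reshape_to_n_dims; the real
    # check also accepts h5py.Dataset and dask.array objects.
    if not isinstance(h5_main, np.ndarray):
        raise TypeError('h5_main should either be a h5py.Dataset or numpy array')
    return True

print(check_main(np.zeros((4, 4))))  # True
try:
    check_main(None)  # e.g. a result dataset that was never populated
except TypeError as exc:
    print(exc)
```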
Any help would be appreciated!

Issue #220: BEOdfTranslator failure (2020-05-15)
https://code.ornl.gov/rvv/pycroscopy/-/issues/220
Created by: ramav87
The BE ODF translator is unable to translate certain old files collected in 2010. This error was found for nonlinearity measurements:
> ---------------------------------------------------------------------------
> KeyError Traceback (most recent call last)
> <ipython-input-6-6c7fce848344> in <module>
> 4 translator = px.translators.BEodfTranslator()
> 5
> ----> 6 translator.translate(file_path)
>
> //anaconda3/lib/python3.7/site-packages/pycroscopy/io/translators/be_odf.py in translate(self, file_path, show_plots, save_plots, do_histogram, verbose)
> 251
> 252 if isBEPS:
> --> 253 (UDVS_labs, UDVS_units, UDVS_mat) = self.__build_udvs_table(parm_dict)
> 254
> 255 # Remove the unused plot group columns before proceeding:
>
> //anaconda3/lib/python3.7/site-packages/pycroscopy/io/translators/be_odf.py in __build_udvs_table(self, parm_dict)
> 1014 BE_amp = parm_dict['BE_amplitude_[V]']
> 1015
> -> 1016 VS_amp = parm_dict['VS_amplitude_[V]']
> 1017 VS_offset = parm_dict['VS_offset_[V]']
> 1018 # VS_read_voltage = parm_dict['VS_read_voltage_[V]']
>
> KeyError: 'VS_amplitude_[V]'
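One way to see the failure mode: very old parameter files may simply not contain a 'VS_amplitude_[V]' entry, so direct indexing raises KeyError. A defensive lookup with dict.get would avoid the crash. This is a sketch only; the parm_dict contents and the 0.0 default are assumptions, not values taken from the translator:

```python
# Hypothetical parameter dictionary from an old (2010-era) file that
# lacks the 'VS_amplitude_[V]' key the translator expects.
parm_dict = {'BE_amplitude_[V]': 1.0, 'VS_offset_[V]': 0.0}

# dict.get returns a default instead of raising KeyError; whether 0.0
# is a physically sensible default here is an open question.
VS_amp = parm_dict.get('VS_amplitude_[V]', 0.0)
VS_offset = parm_dict.get('VS_offset_[V]', 0.0)
print(VS_amp, VS_offset)  # 0.0 0.0
```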
Will explore this in more detail.

Issue #192: Out of date notebooks (2018-10-08)
https://code.ornl.gov/rvv/pycroscopy/-/issues/192
Created by: DancingQuanta
I am trying to follow the tutorial Tutorial_04_Interactive_Visualization, but I found that some things are broken because functions used in the notebooks have been renamed or have changed functionality.
Issue #167: loop_fitter missing checks for existing results (2018-06-11)
https://code.ornl.gov/rvv/pycroscopy/-/issues/167
Created by: ramav87
The do_guess and do_fit methods of BELoopFitter are missing the override options that have been added to SHOFitter. This causes an error in the latest BE processing notebook.
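To illustrate the missing behavior, here is the general shape of an override option: return cached results unless explicitly asked to recompute. The class and method bodies below are purely illustrative assumptions, not the BELoopFitter API:

```python
# Illustrative sketch of the override pattern the issue requests; the
# class name and the cached-guess representation are hypothetical.
class LoopFitterSketch:
    def __init__(self):
        self._guess = None

    def do_guess(self, override=False):
        # Return the previously computed guess unless override=True
        if self._guess is not None and not override:
            return self._guess
        self._guess = {'status': 'computed'}  # stand-in for the real fit
        return self._guess

fitter = LoopFitterSketch()
first = fitter.do_guess()
second = fitter.do_guess()              # cached: same object comes back
third = fitter.do_guess(override=True)  # forced recomputation
print(first is second, first is third)  # True False
```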
Issue #157: Reshape_to_n_dims for non-square images (2018-05-26)
https://code.ornl.gov/rvv/pycroscopy/-/issues/157
Created by: rajgiriUW
Posted on Slack, but for tracking.
The issue is with non-square image files: I think the reshape_to_n_dims function is reordering the data when it shouldn't be, resulting in jagged-looking images (if rows > columns) or in the image being repeated vertically (if columns > rows).
Here's what happens when running this on an example:

```
Position dimensions: ['X' 'Y']
Position sort order: [0 1]
Spectroscopic Dimensions: ['arb']
Spectroscopic sort order: [0]
Position dimensions (sort applied): ['X' 'Y']
Position dimensionality (sort applied): [256, 128]
Spectroscopic dimensions (sort applied): ['arb']
Spectroscopic dimensionality (sort applied): [1]
After first reshape, labels are ['Y' 'X' 'arb']
Data shape is (128, 256, 1)
Axes will permuted in this order: [1 0 2]
New labels ordering: ['X' 'Y' 'arb']
Dataset now of shape: (256, 128, 1)
```
Suhas seems to think the issue is in the line

> Axes will permuted in this order: [1 0 2]

since that is what changes the dimensions.
I confirmed in the Igor IBW Translator that the position dimensions are being written correctly. It is possible to correct the Translator to fix this, I think, but I then expect the issue to pop up in other translators.
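To illustrate the kind of mix-up involved, here is a generic NumPy sketch, not the pyUSID code itself: flattened position data must be reshaped with the fastest-varying axis last and then transposed, while reshaping directly to the target shape reinterprets memory order and scrambles the pixels. For square images the wrong version is just the transposed image, which is easy to miss; for non-square images the rows break up, matching the jagged symptom described above.

```python
import numpy as np

rows, cols = 128, 256          # Y x X, non-square on purpose
flat = np.arange(rows * cols)  # pixel k sits at (y = k // cols, x = k % cols)

# Correct: reshape with the fastest-varying axis (X) last, then transpose
img_good = flat.reshape(rows, cols).T    # shape (256, 128): axes (X, Y)

# Wrong: reshaping straight to the target shape reinterprets memory order
img_bad = flat.reshape(cols, rows)       # same shape (256, 128), scrambled

print(img_good.shape, np.array_equal(img_good, img_bad))  # (256, 128) False
```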
Issue #156: svd_utils.rebuild_svd bug in create_indexed_group (2018-05-22)
https://code.ornl.gov/rvv/pycroscopy/-/issues/156
Created by: rajgiriUW
I think this was a bug introduced in commit f327da7d98907b9ba09473055320e12ddb04e122 while updating svd_utils. Line 328:

```
rebuilt_grp = create_indexed_group('Rebuilt_Data', h5_svd_group.name[1:])
```

throws an error, since create_indexed_group requires an h5py group, not a string.
I think this change should work (will test):

```
rebuilt_grp = create_indexed_group(h5_svd_group, 'Rebuilt_Data')
```

Issue #152: svd_rebuild error with get_component_slice returning ndarray (2018-05-08)
https://code.ornl.gov/rvv/pycroscopy/-/issues/152
Created by: rajgiriUW
In svd_utils.rebuild_svd:

```
comp_slice, num_comps = get_component_slice(components, total_components=h5_main.shape[1])
```

will cause an error later in:

```
n_comps = h5_S[comp_slice].size
```

if comp_slice is an ndarray rather than a list. This seems to be an h5py limitation.

Issue #137: sdist tarball has wrong contents (2018-03-27)
https://code.ornl.gov/rvv/pycroscopy/-/issues/137
Created by: carlodri
the source tarball available on PyPI is not what it should be: it contains a number of nested directories but does not have the structure of a package source. Is this deliberate?
This is strictly related to #134, since conda-forge is based on `sdist` tarballs.

Issue #136: BESHOFitter does not find existing results (2018-05-22)
https://code.ornl.gov/rvv/pycroscopy/-/issues/136
Created by: CompPhysChris
The Fitter method _check_for_old_guess does not find completed fits.
Issue #135: Plot attributes are hard coded (2018-05-26)
https://code.ornl.gov/rvv/pycroscopy/-/issues/135
Created by: CompPhysChris
Some plot attributes are being hard coded rather than pulled from the datasets, e.g. the field names in visualize_sho_results.
Issue #130: SignalFilter memory use (2018-04-05)
https://code.ornl.gov/rvv/pycroscopy/-/issues/130
Created by: CompPhysChris
A custom memory-use calculation is needed for SignalFilter: taking the FFT of the input data converts it to complex values, which increases the initial size.
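A hedged sketch of the kind of estimate meant here, with hypothetical names and not SignalFilter's actual accounting: NumPy's FFT of real float32 data returns complex128, so the transformed copy alone is 4x the input, and a budget based only on the input dtype undercounts.

```python
import numpy as np

def fft_memory_bytes(shape, in_dtype=np.float32):
    """Rough memory estimate (sketch): the input plus its complex FFT.

    np.fft.fft on float32 input returns complex128, so the transformed
    copy alone is 4x the size of the float32 input used here.
    """
    n = int(np.prod(shape))
    in_bytes = n * np.dtype(in_dtype).itemsize       # 4 bytes/element
    out_bytes = n * np.dtype(np.complex128).itemsize  # 16 bytes/element
    return in_bytes + out_bytes

print(fft_memory_bytes((1024, 1024)))  # 20971520
```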
Issue #105: Igor ibw translation errors in python 3 (2018-03-12)
https://code.ornl.gov/rvv/pycroscopy/-/issues/105
Created by: ssomnath
```
pycroscopy\io\translators\igor_ibw.py in _read_parms(ibw_wave)
    173         Dictionary containing parameters
    174         """
--> 175         parm_string = ibw_wave.get('note').decode('utf-8')
    176         parm_string = parm_string.rstrip('\r')
    177         parm_list = parm_string.split('\r')

UnicodeDecodeError: 'utf-8' codec can't decode byte 0xb0 in position 2200: invalid start byte
```

Issue #104: dm3 translation in python 3 results in an Assertion Error (2018-01-18)
https://code.ornl.gov/rvv/pycroscopy/-/issues/104
Created by: ssomnath
```
/io/translators/df_utils/parse_dm3.py in parse_dm_tag_data(f, outdata)
    227     else:
    228         _delim, header_len, data_type = get_from_file(f, "> 4s l l")
--> 229         assert(_delim == "%%%%")
    230     ret, header = dm_types[data_type](f)
    231     assert(header + 1 == header_len)
```

Issue #102: Incorrect reshaping on loop projection (2018-01-18)
https://code.ornl.gov/rvv/pycroscopy/-/issues/102
Created by: ramav87
Error in projectLoop batch (in be_loop_model), where the order of the reshaped array after loop projection is reversed from what it should be.

Issue #70: Fix the dendrogram plotting function (2018-01-19)
https://code.ornl.gov/rvv/pycroscopy/-/issues/70
Created by: ssomnath
Issue #64: Window slices on mac are floats (2018-01-18)
https://code.ornl.gov/rvv/pycroscopy/-/issues/64
Created by: CompPhysChris
The win_slices are all floats on Mac (reported by Artem): uint + int is returning a float.
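The uint + int promotion can be reproduced directly in NumPy. This is a generic illustration; whether this is the exact code path behind the Mac-specific win_slices symptom is an assumption. No integer dtype can hold both the uint64 and int64 ranges, so NumPy promotes the sum to float64, and float bounds then break slicing unless cast back to int:

```python
import numpy as np

start = np.uint64(10)   # e.g. a window start index stored as unsigned
step = np.int64(5)      # a signed integer offset

# No integer dtype represents both uint64 and int64 ranges, so the sum
# is promoted to float64; a float slice bound then raises at index time.
mixed = start + step
print(mixed.dtype)  # float64

# Casting explicitly to plain Python ints keeps slice bounds integral
arr = np.arange(100)
window = arr[int(start):int(start) + int(step)]
print(window.size)  # 5
```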
Issue #49: repack in ioHDF5 still crashing (2016-12-14)
https://code.ornl.gov/rvv/pycroscopy/-/issues/49
Created by: ssomnath
Issue #42: Better naming conventions for units in attributes (2016-12-13)
https://code.ornl.gov/rvv/pycroscopy/-/issues/42
Created by: ssomnath
Currently, attributes are named "attribute_1_unit" for those with units and "attribute_2" for those without. A parser looking for certain attributes would currently absorb "unit" and "2" as the units. We should rename the attributes to something like "attribute_1__unit" or "attribute_1-unit" to avoid confusion.
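A sketch of why the single-underscore scheme is ambiguous and how a reserved double-underscore separator removes the ambiguity. The parser below is illustrative only, not pycroscopy code:

```python
def split_units(attr_name):
    # Illustrative parser: with a reserved '__' separator the unit suffix
    # can be detected unambiguously; with a single '_' it cannot, since
    # a name like 'attribute_2' would wrongly yield '2' as its unit.
    if '__' in attr_name:
        base, unit = attr_name.rsplit('__', 1)
        return base, unit
    return attr_name, None

print(split_units('attribute_1__V'))  # ('attribute_1', 'V')
print(split_units('attribute_2'))     # ('attribute_2', None)
```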
Issue #41: BE noise floor should be a 2D dataset and should come with its own spectroscopic datasets (2016-12-13)
https://code.ornl.gov/rvv/pycroscopy/-/issues/41
Created by: ssomnath
Issue #40: Problem in automatically estimating window size in image windowing (2016-12-14)
https://code.ornl.gov/rvv/pycroscopy/-/issues/40
Created by: ssomnath