NOTICE! This is a static HTML version of a legacy Fiji BugZilla bug.

The Fiji project now uses GitHub Issues for issue tracking.

Please file all new issues there.

Bug 989 - HDF5 plugin doesn't load
Status: RESOLVED FIXED
Product: Fiji
Classification: Unclassified
Component: Plugins
Version: unspecified
Hardware: PC Windows
Importance: P4 normal
Assigned To: ImageJ Bugs Mailing List
Duplicates: 995
Depends on:
Blocks:
 
Reported: 2015-01-21 09:58 CST by Tommy
Modified: 2015-02-05 08:04 CST
CC List: 7 users

See Also:

Description Tommy 2015-01-21 09:58:14 CST
I installed Fiji and ran the updater.  Then I added the HDF5 plugin from the manage update sites window.  After I restarted Fiji, I tried to import an H5 file.  The Log window opens and shows /imaging:DATASET, but it never brings up the Java window that lets me select the datasets I want to open.  I thought maybe it was this computer, but it also happens on the updated version of Fiji on my MacBook Pro.  Any thoughts?  Thanks

Information about your version of Java:

  os.arch => amd64
  os.name => Windows 7
  os.version => 6.1
  java.version => 1.6.0_24
  java.vendor => Sun Microsystems Inc.
  java.runtime.name => Java(TM) SE Runtime Environment
  java.runtime.version => 1.6.0_24-b07
  java.vm.name => Java HotSpot(TM) 64-Bit Server VM
  java.vm.version => 19.1-b02
  java.vm.vendor => Sun Microsystems Inc.
  java.vm.info => mixed mode
  java.awt.graphicsenv => sun.awt.Win32GraphicsEnvironment
  java.specification.name => Java Platform API Specification
  java.specification.version => 1.6
  sun.cpu.endian => little
  sun.desktop => windows
  file.separator => \

The up-to-date check says: REMIND_LATER

Information relevant to JAVA_HOME related problems:

  JAVA_HOME is set to: C:\Users\Polleux\DOCUME~1\FIJI-W~1\Fiji.app/java/win64/jdk1.6.0_24//jre
  imagej.dir => C:\Users\Polleux\DOCUME~1\FIJI-W~1\Fiji.app

Information about the version of each plugin:

Activated update sites:
ImageJ: http://update.imagej.net/ (last check:20150116111452)
Fiji: http://fiji.sc/update/ (last check:20150116114028)
HDF5: http://sites.imagej.net/Ronneber/ (last check:20140827124014)

Files not up-to-date:
  1ad3be0d (LOCAL_ONLY) 20140115091640 jars/jpedalSTD.jar
  e58ee14c (LOCAL_ONLY) 20141203154844 macros/Red and Green Puncta Colocalization Macro .txt
  ed38cb75 (LOCAL_ONLY) 20141203155528 macros/Red and Green Puncta.txt
  3e320440 (LOCAL_ONLY) 20141203155653 macros/Red and Green.txt
  58fcc762 (LOCAL_ONLY) 20141203154402 plugins/Analyze/Colocalization_.class
  347fe260 (LOCAL_ONLY) 20120320144156 plugins/Analyze/vamp/Readme.txt
  ff4fbcde (LOCAL_ONLY) 20120320143128 plugins/Analyze/vamp/vamp2d_v2$pixel.class
  ee7434b3 (LOCAL_ONLY) 20120320143128 plugins/Analyze/vamp/vamp2d_v2.class
  56986440 (LOCAL_ONLY) 20120320143144 plugins/Analyze/vamp/vamp3d_v1$pixel.class
  25b3018b (LOCAL_ONLY) 20120320143144 plugins/Analyze/vamp/vamp3d_v1.class
  3589dfdf (LOCAL_ONLY) 20140117163049 plugins/MultiStackReg1.45_.jar
  946ffba8 (LOCAL_ONLY) 20140219144622 plugins/MultipleKymograph_.class
  54d2cdf2 (LOCAL_ONLY) 20140219144626 plugins/MultipleOverlay_.class
  ed16bdfe (LOCAL_ONLY) 20141203155059 plugins/Red and Green Puncta Colocalization Macro .txt
  bf8660d2 (LOCAL_ONLY) 20140219144630 plugins/StackDifference_.class
  d3721d57 (LOCAL_ONLY) 20140219144634 plugins/WalkingAverage_.class
  b2afde18 (LOCAL_ONLY) 20140121162540 plugins/cell_counter.jar
  774a1cc0 (LOCAL_ONLY) 20140219144638 plugins/tsp050706.txt
Comment 1 Curtis Rueden 2015-01-21 11:28:22 CST
You could try launching ImageJ from the console [1] to get more information. Or turn on Debug mode in Edit > Options > Misc, which will redirect the console to the ImageJ log. I'm guessing there is an exception happening, which is normally silently discarded [2].
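
For illustration, one way to surface such a discarded exception is to invoke the plugin directly from a small Java class and log whatever it throws. This is a minimal, hypothetical sketch (not part of Fiji); the HDF5_Reader_Vibez class name is taken from the stack trace in comment 4 below:

import ij.IJ;
import ij.ImageJ;

/** Hypothetical debugging helper: run the HDF5 reader directly and log any error. */
public class DebugHdf5Import {
  public static void main(String[] args) {
    new ImageJ();  // start the ImageJ GUI so the plugin can show its dialogs
    try {
      IJ.runPlugIn("HDF5_Reader_Vibez", "");
    } catch (Throwable t) {
      IJ.log("HDF5 import failed: " + t);  // shows up in the Log window
      t.printStackTrace();                 // visible when launched from a console
    }
  }
}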

Anyway, since this is a problem with an external update site, you will need to contact the maintainer of that site [3].

[1] http://imagej.net/Debugging#Launching_ImageJ_in_debug_mode
[2] https://github.com/imagej/imagej-legacy/issues/97
[3] ronneber@informatik.uni-freiburg.de
Comment 2 Curtis Rueden 2015-01-27 17:22:43 CST
*** Bug 995 has been marked as a duplicate of this bug. ***
Comment 3 Olaf Ronneberger 2015-01-29 09:24:23 CST
Hi,

It is not a problem with my plugin; someone has put an old version of the jhdf5 library onto the main Fiji update site (as jars/jhdf5.jar). Now my plugin can no longer find the new library and crashes.
@Curtis: How can I find the person responsible, so that the old library can be removed or updated?
Best regards,

Olaf
P.S.: A quick and dirty fix for HDF5 plugin users is to remove or rename the jhdf5-13.06.2.jar file in the Fiji.app/jars directory.
Comment 4 Niko Ehrenfeuchter 2015-02-04 11:23:26 CST
Hi all,

here is the corresponding stack trace from running Fiji with "--debug":

---------------------------------------
There was a problem with the class ch.systemsx.cisd.hdf5.HDF5DataTypeInformation which can be found here:
/opt/packages/Fiji.app/jars/jhdf5-13.06.2.jar
/opt/packages/Fiji.app/plugins/cisd-jhdf5-batteries_included_lin_win_mac.jar

WARNING: multiple locations found!
java.lang.NoSuchMethodError: ch.systemsx.cisd.hdf5.HDF5DataTypeInformation.isSigned()Z
	at HDF5ImageJ.dsInfoToTypeString(HDF5ImageJ.java:799)
	at HDF5_Reader_Vibez.recursiveGetInfo(HDF5_Reader_Vibez.java:423)
	at HDF5_Reader_Vibez.recursiveGetInfo(HDF5_Reader_Vibez.java:451)
	at HDF5_Reader_Vibez.run(HDF5_Reader_Vibez.java:167)
	at ij.IJ.runUserPlugIn(IJ.java:202)
	at ij.IJ.runPlugIn(IJ.java:166)
	at ij.Executer.runCommand(Executer.java:131)
	at ij.Executer.run(Executer.java:64)
	at java.lang.Thread.run(Thread.java:662)

java.lang.NoSuchMethodException: Could not find method ch.systemsx.cisd.hdf5.HDF5DataTypeInformation.isSigned()Z
There was a problem with the class ch.systemsx.cisd.hdf5.HDF5DataTypeInformation which can be found here:
/opt/packages/Fiji.app/jars/jhdf5-13.06.2.jar
/opt/packages/Fiji.app/plugins/cisd-jhdf5-batteries_included_lin_win_mac.jar

WARNING: multiple locations found!
java.lang.NoSuchMethodError: ch.systemsx.cisd.hdf5.HDF5DataTypeInformation.isSigned()Z
	at HDF5ImageJ.dsInfoToTypeString(HDF5ImageJ.java:799)
	at HDF5_Reader_Vibez.recursiveGetInfo(HDF5_Reader_Vibez.java:423)
	at HDF5_Reader_Vibez.recursiveGetInfo(HDF5_Reader_Vibez.java:451)
	at HDF5_Reader_Vibez.run(HDF5_Reader_Vibez.java:167)
	at ij.IJ.runUserPlugIn(IJ.java:202)
	at ij.IJ.runPlugIn(IJ.java:166)
	at ij.Executer.runCommand(Executer.java:131)
	at ij.Executer.run(Executer.java:64)
	at java.lang.Thread.run(Thread.java:662)

	at HDF5ImageJ.dsInfoToTypeString(HDF5ImageJ.java:799)
	at HDF5_Reader_Vibez.recursiveGetInfo(HDF5_Reader_Vibez.java:423)
	at HDF5_Reader_Vibez.recursiveGetInfo(HDF5_Reader_Vibez.java:451)
	at HDF5_Reader_Vibez.run(HDF5_Reader_Vibez.java:167)
	at ij.IJ.runUserPlugIn(IJ.java:202)
	at ij.IJ.runPlugIn(IJ.java:166)
	at ij.Executer.runCommand(Executer.java:131)
	at ij.Executer.run(Executer.java:64)
	at java.lang.Thread.run(Thread.java:662)
---------------------------------------

Cheers
Niko
Comment 5 Curtis Rueden 2015-02-04 11:44:13 CST
The version of jhdf5 shipped on the Fiji update site is 13.06.2. This is a dependency brought in by the BigDataViewer:
* https://github.com/bigdataviewer/bigdataviewer-core/blob/bigdataviewer-core-1.0.2/pom.xml#L38-L41
* https://github.com/bigdataviewer/pom-bigdataviewer/blob/bigdataviewer-1.0.5/pom.xml#L52

I tried downloading the 14.12.0 release from https://wiki-bsse.ethz.ch/display/JHDF5/Download+Page, but after unpacking I see several JAR files in the lib folder, as well as a "batteries_included" uber-jar (which we definitely do not want to ship with Fiji because it embeds several unshaded dependencies including Apache Commons IO).

It would be ideal if the JHDF5 project could provide consumable Maven artifact(s) so that the BigDataViewer could be easily and cleanly upgraded to use newer versions as they are released.

@Olaf: Is such a Mavenization something you could make happen on your end? If not, what is the best way to proceed here? I can do a one-time deployment of 14.12.0 to the ImageJ Maven repository, and upgrade the core Fiji update site to use it, but I don't want to have to do that every time a new JHDF5 is released.
Comment 6 Mark Hiner 2015-02-04 12:01:08 CST
Also, there is now SCIFIO-HDF5 - https://github.com/scifio/scifio-hdf5 - which is using 13.06.2 (because that's what was on our Maven repo).

However, it looks like that jar is the unshaded uber-jar, which needs to be fixed; otherwise there is a clash between jhdf5 and apache-commons-io (and potentially other third-party dependencies). See also:
http://imagej.net/Frequently_Asked_Questions#How_can_I_call_ImageJ_from_my_software.3F

Two ways to fix this are:
1) Shaded uber-jar (e.g. https://github.com/scijava/jython-shaded)
2) Individual mavenized components

Once a Maven artifact for 14.12.0 exists, we can update SCIFIO-HDF5 and BDV to use it and update the core Fiji update site accordingly. As a bonus, the HDF5 update site wouldn't have to ship its own jhdf5 library.

Let Curtis and me know if you need further guidance/advice on resolving this issue.
Comment 7 Olaf Ronneberger 2015-02-04 14:30:44 CST
Hi Curtis,

The jhdf5 library is developed by Bernd Rinn from ETH SIS (ETH Scientific IT Services). My HDF5 plugin just uses that library, and Bernd was kind enough to add some functionality needed for images (e.g. unsigned data types) to it.
I don't know whether he has the time and capacity to mavenize his library, but I expect updates will not be frequent, so a one-time deployment would probably be the best way.
I will ask him what he thinks.
Best regards,

Olaf
Comment 8 Curtis Rueden 2015-02-04 15:31:26 CST
> a one-time deployment would probably be the best way.

OK, I uploaded the 14.12.0 JHDF5 uber-jar [1] to the Fiji update site.

I also added management of the JHDF5 version to pom-scijava, so that all downstream components can be sure to use the same version in the future:
* https://github.com/scijava/pom-scijava/commit/1adde967eccff1c4f7631bd7113ab770d822535d

We are planning to cut some component releases on Friday, and will take care of updating scifio-hdf5 and bigdataviewer-core to both depend on jhdf5 14.12.0 henceforth.

So this issue should now be resolved. And Olaf, you should be able to remove your shadowed version of jhdf5 from your HDF5 update site, if you wish.

[1] Regarding possible dependency conflicts surrounding the uber-jar: everything "accidentally" works at the moment because the only third-party dependencies included in the JHDF5 uber-jar are Apache Commons I/O and Apache Commons Lang. Fiji does not otherwise ship Commons I/O, so that is not a clash (yet), and the version of Commons Lang shipped by Fiji is Commons Lang 3, which changed its package prefix (to org.apache.commons.lang3) precisely to avoid such version clashes in some cases. So for the time being, we'll tolerate the dependency bundling inside jhdf5. It would be nice to make it proper someday, though.
Comment 9 Tobias Pietzsch 2015-02-05 05:45:40 CST
Hi,

thanks for the quick resolution Curtis!

I will try to make a non-uber-jar mavenized jhdf5 version when I find the time. I'll remove the apache-commons stuff and include that as a pom dependency. It would be nice to still have "batteries-included", i.e. all the native stuff. I don't think it is required to split that up further; what do you think?

Olaf, interesting to hear about the added support for unsigned data types! So far I have worked around this in BigDataViewer by pretending my ushorts are shorts... Could you quickly point me to the relevant additions in the API doc?

all the best,
Tobias
Comment 10 Olaf Ronneberger 2015-02-05 06:21:13 CST
Here is a code snippet from my HDF5ImageJ class. I translate the data type into a string (e.g. "uint8") to make the comparison easier:

import ch.systemsx.cisd.hdf5.HDF5DataSetInformation;
import ch.systemsx.cisd.hdf5.HDF5DataTypeInformation;
import ch.systemsx.cisd.hdf5.HDF5Factory;
import ch.systemsx.cisd.hdf5.IHDF5Reader;
import ch.systemsx.cisd.hdf5.IHDF5ReaderConfigurator;

// Open the file read-only and enable on-the-fly numeric conversions.
IHDF5ReaderConfigurator conf = HDF5Factory.configureForReading(filename);
conf.performNumericConversions();
IHDF5Reader reader = conf.reader();

// Query the dataset's type information.
HDF5DataSetInformation dsInfo = reader.object().getDataSetInformation(dsetName);
HDF5DataTypeInformation dsType = dsInfo.getTypeInformation();
String typeText = "";

// Unsigned types get a "u" prefix. isSigned() is the method that is missing
// from the old jhdf5-13.06.2.jar, which causes the NoSuchMethodError above.
if (!dsType.isSigned()) {
  typeText += "u";
}

switch (dsType.getDataClass())
{
  case INTEGER:
    typeText += "int" + 8 * dsType.getElementSize();
    break;
  case FLOAT:
    typeText += "float" + 8 * dsType.getElementSize();
    break;
  default:
    typeText += dsInfo.toString();
}
Comment 11 Olaf Ronneberger 2015-02-05 06:24:15 CST
Oh, and the rest is straightforward; e.g. use:

byte[] rawdata = reader.uint8().readMDArray(dsetName).getAsFlatArray();
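
Putting comments 10 and 11 together, a minimal end-to-end sketch (a hypothetical helper, not the plugin's actual code; it assumes a plain 2D uint8 dataset) could look like this:

import ch.systemsx.cisd.base.mdarray.MDByteArray;
import ch.systemsx.cisd.hdf5.HDF5Factory;
import ch.systemsx.cisd.hdf5.IHDF5Reader;
import ij.ImagePlus;
import ij.process.ByteProcessor;

/** Hypothetical helper: read a 2D uint8 dataset and display it in ImageJ. */
public class ShowUint8Dataset {
  public static void show(String filename, String dsetName) {
    IHDF5Reader reader = HDF5Factory.openForReading(filename);
    try {
      MDByteArray data = reader.uint8().readMDArray(dsetName);
      int[] dims = data.dimensions();  // HDF5 uses C order, so a 2D dataset is {height, width}
      ByteProcessor bp = new ByteProcessor(dims[1], dims[0], data.getAsFlatArray());
      new ImagePlus(dsetName, bp).show();
    } finally {
      reader.close();
    }
  }
}

For higher-dimensional data, the extra dimensions would need to be mapped onto an ImageStack instead of a single processor.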

best regards,
Olaf
Comment 12 Tobias Pietzsch 2015-02-05 06:25:22 CST
Cool. Thanks, Olaf!
Comment 13 Niko Ehrenfeuchter 2015-02-05 07:38:47 CST
Thanks for the quick solution everyone! Tested it with various HDF5 files on Linux and Windows, works fine :)

I'll try to help Olaf in convincing Bernd Rinn to mavenize the JHDF5 library.
Comment 14 Curtis Rueden 2015-02-05 07:45:22 CST
Thanks everyone. A couple of quick comments:

1) What would be most helpful is for JHDF5 to be available somehow as Maven artifacts. That does _not_ necessarily mean the project needs to switch to Maven as its build system. There are various other ways of accomplishing this: e.g., hook up Ivy to Ant, or use Gradle to build. Personally I like building with Maven, but for those who don't, all that is necessary is to somehow make the already-existing JAR files available as Maven artifacts.

2) The only packaging change I would make is, as I said earlier, to shade or eliminate the embedded Commons I/O and Commons Lang dependencies within the "batteries included" uber-jar. (I agree with Tobi that we don't have to split up the first party functionality.)

3) I did not actually test that BDV still works with the 14.12.0 JAR. Sorry for not doing that, Tobi. Hopefully all is well -- I got the impression that 14.12.0 only adds new functionality and is still backwards compatible with 13.06.2. But it would be good to be certain.
Comment 15 Curtis Rueden 2015-02-05 07:47:08 CST
Oh, one more thing I forgot! Olaf: as Mark said, we now have a SCIFIO HDF5 format, written by Henry Pinkard, at https://github.com/scifio/scifio-hdf5. But right now it only has a writer, not a reader, and (IIUC) it only writes an Imaris-specific flavor of HDF5.

It would be super awesome if your HDF5 reader plugin could become part of the scifio-hdf5 project, so that all tools that use SCIFIO (including ImageJ2) would gain automatic support for reading HDF5. Then we wouldn't need a separate update site for this anymore.
Comment 16 Tobias Pietzsch 2015-02-05 07:52:52 CST
@Curtis: I verified that BDV works with the new jhdf5 version. Let me know when you add a managed version to pom-imagej. Then I'll remove the explicit version from pom-bigdataviewer.

I'm also currently adding support for Imaris files to BDV. Nice to see that everything seems to be converging!
Comment 17 Niko Ehrenfeuchter 2015-02-05 07:57:45 CST
On a side note, there is quite some demand for reading HDF5-based images with the Bio-Formats library, e.g. the SVI (Huygens) flavour:

https://trac.openmicroscopy.org.uk/ome/ticket/4104

Right now, we're using Olaf's Fiji plugin to read the SVI .h5 files, which seem to store the pixel data in one single 5-dimensional array with CTZYX layout.
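
For such a layout, JHDF5's block reads can pull out a single YX plane without loading the whole 5D array. A minimal hypothetical sketch (assuming unsigned 16-bit data; class and variable names are illustrative):

import ch.systemsx.cisd.base.mdarray.MDShortArray;
import ch.systemsx.cisd.hdf5.HDF5Factory;
import ch.systemsx.cisd.hdf5.IHDF5Reader;
import ij.ImagePlus;
import ij.process.ShortProcessor;

/** Hypothetical helper: read one YX plane (channel c, time t, slice z) from a CTZYX dataset. */
public class ReadCtzyxPlane {
  public static ImagePlus readPlane(String filename, String dsetName, long c, long t, long z) {
    IHDF5Reader reader = HDF5Factory.openForReading(filename);
    try {
      long[] dims = reader.object().getDataSetInformation(dsetName).getDimensions();  // {C, T, Z, Y, X}
      int height = (int) dims[3], width = (int) dims[4];
      int[] blockDims = { 1, 1, 1, height, width };  // a single plane
      long[] offset = { c, t, z, 0, 0 };
      MDShortArray plane = reader.uint16().readMDArrayBlockWithOffset(dsetName, blockDims, offset);
      ShortProcessor sp = new ShortProcessor(width, height, plane.getAsFlatArray(), null);
      return new ImagePlus(dsetName + " [c=" + c + ",t=" + t + ",z=" + z + "]", sp);
    } finally {
      reader.close();
    }
  }
}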

I have to say that I'd absolutely love to see the JHDF5 plugin merge with scifio-hdf5; that would be amazing!
Comment 18 Curtis Rueden 2015-02-05 08:04:38 CST
@Niko: If scifio-hdf5 gains HDF5 reading abilities, then all we'll need in order to also have Bio-Formats support is to resurrect our SCIFIO -> Bio-Formats bridge in the scifio-bf-compat component. Right now it enables SCIFIO to read/write all BF formats, but we also want vice versa. We had a working prototype a while back -- it just needs a little spit and polish. We will prioritize it as soon as anybody needs it.