In this article the editor introduces in detail how to use HDF5 files in Python 3. The content is detailed, the steps are clear, and the details are handled properly. I hope this article helps resolve your doubts; follow the editor's ideas and slowly go deeper, and let's learn some new knowledge together.
What is an HDF5 file?
To quote Wikipedia: "Hierarchical Data Format (HDF) is a set of file formats (HDF4, HDF5) designed to store and organize large amounts of data. It was originally developed at the US National Center for Supercomputing Applications (NCSA) and is now supported by the non-profit HDF Group, whose mission is to ensure the continued development of HDF5 technology and the continued accessibility of data stored in HDF."
HDF5 has a series of excellent features that make it particularly suitable for storing and operating on large amounts of scientific data: it supports a large number of data types, it is flexible, universal, cross-platform and scalable, it offers efficient I/O performance, and it supports almost unlimited (up to EB-scale) single-file storage.
How do I view an hdf5 file on Linux?

h5ls info.h5
# key1    Dataset {10000}
# key2    Dataset {10000, 5}
# key3    Dataset {20000, 30}
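If the HDF5 command-line tools are not available, a rough Python equivalent of the listing above can be sketched with the h5py module introduced in the next section (the file name info.h5 is reused from the example above):

import h5py

# print each top-level object's name and shape, similar to the h5ls output
with h5py.File('info.h5', 'r') as f:
    for name, obj in f.items():
        if isinstance(obj, h5py.Dataset):
            print(name, 'Dataset', obj.shape)
        else:
            print(name, 'Group')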
The h5py module

We can use Python to easily read and write hdf5 files, and the most commonly used module for this is h5py. Here is how to install and use it:
Install the modules:

pip install h5py
pip install numpy    # numpy is usually used in conjunction with h5py
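After installing, a quick sanity check is simply to import both modules and print their versions:

import h5py
import numpy as np

# confirm that both modules import correctly
print(h5py.__version__)
print(np.__version__)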
"an HDF5 file is a container that stores two types of objects, which are:
Dataset: a collection of data similar to an array; a gropp;-like container that can contain one or more dataset and other group.
A HDF5 file starts with a group named "/", and all dataset and other group are included under this group. When operating a HDF5 file, if you do not explicitly specify the dataset of group, the default is "/". In addition, similar relative file path group names are relative to "/".
Both the dataset and group of HDF5 files can have descriptive metadata called attribute.
Using h6py to manipulate HDF5 files, we can use group like directories, dataset like numpy arrays, and attributes like dictionaries, which is very convenient and easy to use. "
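To make the group / dataset / attribute analogy concrete, here is a minimal sketch; the file name demo.h5, the group and dataset names, and the attribute keys are made up for illustration:

import h5py
import numpy as np

with h5py.File('demo.h5', 'w') as f:
    grp = f.create_group('experiment1')            # group: behaves like a directory
    dset = grp.create_dataset('temperatures',      # dataset: behaves like a numpy array
                              data=np.random.rand(100))
    dset.attrs['units'] = 'celsius'                # attributes: behave like a dictionary
    dset.attrs['sensor_id'] = 42

with h5py.File('demo.h5', 'r') as f:
    dset = f['experiment1/temperatures']
    print(dset.shape, dict(dset.attrs))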
Write to an hdf5 file

import h5py
import numpy as np

# create datasets under the root group
f = h5py.File('info.h5', 'w')
values1 = np.arange(12).reshape(4, 3)
values2 = np.arange(20).reshape(4, 5)
f.create_dataset(name='key1', data=np.array(values1, dtype='int64'))
f.create_dataset(name='key2', data=np.array(values2, dtype='int64'))

# if you want to create a group (directory), specify the group path
f.create_group('/dir1')
f.create_group('/dir1/dir2')
data = np.arange(6).reshape(3, 2)
f.create_dataset('/dir1/dir2/key3', data=data)    # the dataset lives inside the nested group

# and don't forget to close the file
f.close()

Read an hdf5 file

import h5py

with h5py.File('info.h5', 'r') as f:
    values1 = f['key1'][()]    # the older f['key1'].value syntax is deprecated
    values2 = f['key2'][()]

Traverse an hdf5 file

import h5py
import numpy as np

f = h5py.File('train/e1_1.hdf5', 'r')
for key in f.keys():
    d = f[key]
    print(d)
    a = np.ones(d.shape)
    d.read_direct(a)    # read the dataset directly into the pre-allocated array
    print(a)
f.close()

That is all for this article on how to use hdf5 files in Python 3. Mastering what is covered here still takes practice and hands-on use; if you want to learn more, welcome to follow the industry information channel.