2025-01-16 Update From: SLTechnology News&Howtos
Shulou(Shulou.com)06/03 Report--
tf_serving/
|-- WORKSPACE
|-- tensorflow_serving/
|   |-- BUILD
|   |-- workspace.bzl
|   `-- example/
|       |-- BUILD
|       |-- imagenet_lsvrc_2015_synsets.txt
|       |-- imagenet_metadata.txt
|       |-- inception_client.cc
|       |-- inception_client.py
|       |-- inception_k8s.yaml
|       |-- inception_saved_model.py
|       |-- mnist_client.py
|       |-- mnist_input_data.py
|       `-- mnist_saved_model.py
|-- tensorflow/
|   |-- BUILD
|   |-- WORKSPACE
|   `-- tensorflow/
|       |-- BUILD
|       `-- workspace.bzl
`-- tf_models/
    |-- WORKSPACE
    |-- official/
    |-- tutorials/
    `-- research/
        `-- inception/
            |-- WORKSPACE
            `-- inception/
                |-- BUILD
                |-- inception_train.py
                |-- inception_model.py
                |-- inception_eval.py
                `-- inception_distributed_train.py
Interpretation:
The project folder is named "tf_serving", and the WORKSPACE file sits in the root directory of that folder.
tf_serving/WORKSPACE interpretation:

# Declare the workspace name, consistent with the project name
workspace(name = "tf_serving")

# Declare the name and path of the local repository 'tensorflow'
local_repository(
    name = "org_tensorflow",
    path = "tensorflow",
)

# Declare io_bazel_rules_closure
http_archive(…)

# Pull in the TensorFlow Serving dependencies declared in workspace.bzl
load("//tensorflow_serving:workspace.bzl", "tf_serving_workspace")
tf_serving_workspace()

# Specify the minimum required bazel version
load("@org_tensorflow//tensorflow:workspace.bzl", "check_version")
check_version("0.5.4")
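The `load()` statements above use Bazel's `//package:target` label form, optionally prefixed with an `@repo` external-repository name. As an illustration (this helper is hypothetical, not part of Bazel), a label can be split into its repository and workspace-relative file path like this:

```python
def label_to_path(label):
    """Split a Bazel label into (repository or None, workspace-relative path).

    '//tensorflow_serving:workspace.bzl' resolves against the main workspace;
    '@org_tensorflow//...' resolves against that external repository's root.
    """
    repo, _, rest = label.partition("//")
    package, _, target = rest.partition(":")
    path = "/".join(part for part in (package, target) if part)
    return (repo.lstrip("@") or None, path)
```

So the two `load()` calls above read `tensorflow_serving/workspace.bzl` from the main workspace and `tensorflow/workspace.bzl` from the `org_tensorflow` repository declared by `local_repository`.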
tf_serving/tensorflow_serving/workspace.bzl interpretation:

# TensorFlow Serving external dependencies, loaded into the WORKSPACE file
load('@org_tensorflow//tensorflow:workspace.bzl', 'tf_workspace')

# All of TensorFlow Serving's external dependencies.
# workspace_dir is the absolute path of the TensorFlow Serving repo; if it is
# linked in as a submodule, the path takes the form '__workspace_dir__ + "/serving"'
def tf_serving_workspace():
    native.new_local_repository(
        name = "inception_model",
        path = "tf_models/research/inception",
        build_file = "tf_models/research/inception/inception/BUILD",
    )

    tf_workspace(path_prefix = "", tf_repo_name = "org_tensorflow")

    # gRPC dependencies
    native.bind(
        name = "libssl",
        actual = "@boringssl//:ssl",
    )
    native.bind(
        name = "zlib",
        actual = "@zlib_archive//:zlib",
    )
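Conceptually, `native.bind` just registers an alias under `//external` that points at a real target, so dependents can refer to `//external:zlib` without knowing which archive provides it. A hypothetical Python stand-in (not Bazel's implementation) for that name-to-label mapping:

```python
# Hypothetical stand-in for Bazel's bind(): an //external alias table.
_BINDINGS = {}

def bind(name, actual):
    """Map //external:<name> to the real target `actual`."""
    _BINDINGS["//external:" + name] = actual

def resolve(label):
    """Follow an //external alias to its actual target, if one is bound."""
    return _BINDINGS.get(label, label)
```

With the two `bind()` calls from the file above, `resolve("//external:libssl")` would yield `@boringssl//:ssl`, while unbound labels pass through unchanged.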
tf_serving/tensorflow_serving/BUILD interpretation:

# TensorFlow Serving top-level package description
package(default_visibility = ["//tensorflow_serving:internal"])

# Open-source license tag
licenses(["notice"])

exports_files(["LICENSE"])

package_group(
    name = "internal",
    packages = ["//tensorflow_serving/..."],
)

filegroup(
    name = "all_files",
    srcs = glob(
        ["**/*"],
        exclude = [
            "**/METADATA",
            "**/OWNERS",
            "g3doc/sitemap.md",
        ],
    ),
)
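The `all_files` filegroup collects every file under the package and then filters out a few paths. A rough Python approximation of that include/exclude logic, using `fnmatch` (whose `*` crosses `/`, which only approximates Bazel's `**` semantics; the helper name is made up for illustration):

```python
from fnmatch import fnmatch

def bazel_like_glob(files, include, exclude=()):
    """Filter `files` (workspace-relative paths) roughly the way the BUILD
    file's glob(include, exclude=...) does."""
    kept = [f for f in files if any(fnmatch(f, pat) for pat in include)]
    return [f for f in kept if not any(fnmatch(f, pat) for pat in exclude)]
```

For example, with the patterns from the BUILD file, `model_servers/main.cc` survives the filter while `g3doc/sitemap.md` and any `METADATA` file in a subdirectory are dropped.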
tf_serving/tensorflow/tensorflow/workspace.bzl interpretation:

# TensorFlow external dependencies that can be loaded in a WORKSPACE file
load(...)

def _is_windows(): ...

def _get_env_var(): ...

# Parse the bazel version string from 'native.bazel_version'
def _parse_bazel_version(): ...

# Check that the version of bazel in use is at least the specified one
def check_version(): ...

# Temporary workaround to support TensorFlow as a submodule
def _temp_workaround_http_archive_impl(): ...

# Execute a command with the given arguments, and call 'fail' if it exits
# with a non-zero return code
def _execute_and_check_ret_code(): ...

# Apply a patch file in the root directory of the repository
def _apply_patch(): ...

# Download the repository and apply a patch in its root directory
def _patched_http_archive_impl(): ...

# If TensorFlow is linked in as a submodule, path_prefix is no longer used
# and tf_repo_name is under consideration
def tf_workspace(): ...
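The pairing of `_parse_bazel_version` and `check_version` boils down to turning the version string into an integer tuple and comparing it against the minimum. A standalone sketch of that logic (plain Python, not the Starlark from the .bzl file, and simplified against the original):

```python
def parse_bazel_version(bazel_version):
    """Turn a version string such as '0.5.4' or '0.5.4rc1' into an
    integer tuple, ignoring any trailing non-numeric suffix."""
    parts = []
    for segment in bazel_version.split("."):
        digits = ""
        for ch in segment:
            if not ch.isdigit():
                break
            digits += ch
        parts.append(int(digits or 0))
    return tuple(parts)

def check_version(bazel_version, minimum):
    """Raise if the running bazel is older than `minimum` (the .bzl
    version calls Starlark's fail() instead)."""
    if parse_bazel_version(bazel_version) < parse_bazel_version(minimum):
        raise RuntimeError("bazel %s is too old; at least %s is required"
                           % (bazel_version, minimum))
```

Tuple comparison is what makes `0.4.5` correctly rank below `0.5.4`, which a plain string comparison of the dotted versions would not guarantee in general.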
tf_serving/tensorflow/tensorflow/BUILD interpretation:

package(default_visibility = [":internal"])

licenses(["notice"])

exports_files([
    "LICENSE",
    "ACKNOWLEDGMENTS",
    # leakr files for //third_party/cloud_tpu
    "leakr_badwords.dic",
    "leakr_badfiles.dic",
])

load("//tensorflow:tensorflow.bzl", "tf_cc_shared_object")
load("//tensorflow/core:platform/default/build_config.bzl", "tf_additional_binary_deps")

# Various config settings
config_setting(...)
package_group(...)
filegroup(...)
py_library(...)

filegroup(
    name = "all_opensource_files",
    data = [
        ":all_files",
        "//tensorflow/c:all_files",
        "//tensorflow/cc:all_files",
        ...
    ],
    visibility = [":__subpackages__"],
)

load("//third_party/mkl:build_defs.bzl", "if_mkl")

filegroup(
    name = "intel_binary_blob",
    data = if_mkl([
        "//third_party/mkl:intel_binary_blob",
    ]),
)

filegroup(
    name = "docs_src",
    data = glob(["docs_src/**/*.md"]),
)

tf_cc_shared_object(...)