The neural networks samples included with the Intel® Data Analytics Acceleration Library (Intel® DAAL) show how to use this library to create the most common neural network topologies, such as LeNet, GoogLeNet, AlexNet, VGG-19, and ResNet-50, in a C++ application.
Unzip the Intel® DAAL samples archive to your working directory (<sample_dir>).
You can use the Intel® DAAL neural networks samples on the Linux*, Windows*, and OS X* operating systems. For a detailed list of Intel® DAAL hardware and software requirements, refer to the release notes of the Intel® DAAL product you are using.
These samples can be used with the standard MNIST dataset, which can be downloaded from the links below:
train-images-idx3-ubyte.gz - training set images - 55000 training images, 5000 validation images
train-labels-idx1-ubyte.gz - training set labels matching the images
t10k-images-idx3-ubyte.gz - test set images - 10000 images
t10k-labels-idx1-ubyte.gz - test set labels matching the images
Download the dataset, uncompress it, and place the files in the <sample_dir>\cpp\neural_networks\data folder before running the samples.
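For example, assuming a shell with curl and gunzip available, and using the widely mirrored MNIST location at yann.lecun.com (treat this host as an assumption; any mirror providing the same file names works), the files can be fetched and uncompressed in place:
cd <sample_dir>/cpp/neural_networks/data
curl -O http://yann.lecun.com/exdb/mnist/train-images-idx3-ubyte.gz
curl -O http://yann.lecun.com/exdb/mnist/train-labels-idx1-ubyte.gz
curl -O http://yann.lecun.com/exdb/mnist/t10k-images-idx3-ubyte.gz
curl -O http://yann.lecun.com/exdb/mnist/t10k-labels-idx1-ubyte.gz
gunzip *.gz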
Before you build the samples, you must set certain environment variables that define the location of related libraries. Intel® DAAL includes the daalvars scripts that you can run to set these environment variables.
For more information about setting environment variables for different product suites, refer to the product user guide.
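As a sketch, assuming a default installation layout in which the daalvars scripts reside in the daal\bin folder of your Intel® DAAL installation (<install_dir> below is a placeholder for that location), you would run one of the following before building:
On Windows*:
<install_dir>\daal\bin\daalvars.bat intel64
On Linux* and OS X*:
source <install_dir>/daal/bin/daalvars.sh intel64
Pass ia32 instead of intel64 if you target the IA-32 architecture.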
To build the Intel® DAAL neural networks C++ samples on Windows*, go to the C++ neural networks samples directory and execute the launcher command with the build parameter:
cd <sample_dir>\cpp\neural_networks
launcher.bat {ia32|intel64} build
The command creates the .\_results\ia32 or .\_results\intel64 directory, builds the *.exe executables, and creates a log file with the build results.
To run the Intel® DAAL neural networks C++ samples, go to the C++ neural networks samples directory and execute the launcher command with the run parameter:
cd <sample_dir>\cpp\neural_networks
launcher.bat {ia32|intel64} run
Select the same architecture parameter as you provided to the launcher command with the build parameter.
For each sample, the results are placed into the .\_results\ia32\<sample name>\.res or .\_results\intel64\<sample name>\.res file, depending on the specified architecture.
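As a concrete sketch of one invocation of the commands above, building and then running all samples for the Intel® 64 architecture looks like this:
cd <sample_dir>\cpp\neural_networks
launcher.bat intel64 build
launcher.bat intel64 run
The per-sample output then appears under .\_results\intel64\ as described above.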
To build the Intel® DAAL neural networks C++ samples on Linux*, go to the C++ neural networks samples directory and execute the make command:
cd <sample_dir>/cpp/neural_networks
make {libia32|soia32|libintel64|sointel64} compiler={intel|gnu} mode=build
From the {libia32|soia32|libintel64|sointel64} parameters, select the one that matches the architecture parameter you provided to the daalvars.sh script and that has the prefix that matches the type of executables you want to build: lib for static and so for dynamic executables.
The command creates a directory for the chosen compiler, architecture, and library extension (a or so). For example: _results/intel_intel64_a.
To run the Intel® DAAL neural networks C++ samples on Linux*, go to the C++ neural networks samples directory and execute the make command in the run mode. For example, if you ran the daalvars script with the intel64 target:
cd <sample_dir>/cpp/neural_networks
make libintel64 mode=run
The make command builds the samples with static linking for the Intel® 64 architecture and runs the resulting executables.
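As a concrete sketch using one choice of the parameters above (the compiler value shown is illustrative), building with the Intel compiler and static linking for the Intel® 64 architecture and then running the samples looks like this:
cd <sample_dir>/cpp/neural_networks
make libintel64 compiler=intel mode=build
make libintel64 compiler=intel mode=run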
To build the Intel® DAAL neural networks C++ samples on OS X*, go to the C++ neural networks samples directory and execute the make command:
cd <sample_dir>/cpp/neural_networks
make {libia32|dylibia32|libintel64|dylibintel64} compiler={intel|gnu|clang} mode=build
From the {libia32|dylibia32|libintel64|dylibintel64} parameters, select the one that matches the architecture parameter you provided to the daalvars.sh script and that has the prefix that matches the type of executables you want to build: lib for static and dylib for dynamic executables.
The command creates a directory for the chosen compiler, architecture, and library extension (a or dylib). For example: _results/intel_intel64_a.
To run the Intel® DAAL neural networks C++ samples on OS X*, go to the C++ neural networks samples directory and execute the make command in the run mode. For example, if you ran the daalvars script with the intel64 target:
cd <sample_dir>/cpp/neural_networks
make libintel64 mode=run
The make command builds the samples with static linking for the Intel® 64 architecture and runs the resulting executables.
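As a concrete sketch for OS X* (parameter choices are illustrative), building dynamically linked samples with the Clang compiler for the Intel® 64 architecture and then running them looks like this:
cd <sample_dir>/cpp/neural_networks
make dylibintel64 compiler=clang mode=build
make dylibintel64 compiler=clang mode=run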
Intel and the Intel logo are trademarks of Intel Corporation in the U.S. and/or other countries.
*Other names and brands may be claimed as the property of others.
© Copyright 2016, Intel Corporation
Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors. These optimizations include SSE2, SSE3, and SSSE3 instruction sets and other optimizations. Intel does not guarantee the availability, functionality, or effectiveness of any optimization on microprocessors not manufactured by Intel. Microprocessor-dependent optimizations in this product are intended for use with Intel microprocessors. Certain optimizations not specific to Intel microarchitecture are reserved for Intel microprocessors. Please refer to the applicable product User and Reference Guides for more information regarding the specific instruction sets covered by this notice. Notice revision #20160620