Introduction to oneAPI Containers

ID 688826
Updated 7/21/2021
Version 1.0
Public


Introduction

Intel® oneAPI products deliver the tools needed to deploy applications and solutions across scalar, vector, matrix, and spatial architectures. oneAPI Containers let users set up and configure an environment for building, deploying, and profiling high-performance, data-centric applications across diverse architectures. The following oneAPI Container images contain Intel® oneAPI toolkits and are available at https://software.intel.com/content/www/cn/zh/develop/tools/containers.html:

  • Intel® oneAPI Base Toolkit Container is used to build high-performance, data-centric applications across diverse architectures.
  • Intel® oneAPI HPC Toolkit Container is used to develop, analyze, optimize and scale HPC applications.
  • Intel® oneAPI DL Framework Developer Toolkit Container is used to design and build deep learning frameworks.
  • Intel® oneAPI Runtime Libraries Container is used to access runtime versions of the oneAPI libraries for deployment with your applications.
  • Intel® oneAPI AI Analytics Toolkit Container is used to develop AI applications such as deep learning, training, inference, and data analytics.
  • Intel® oneAPI IoT Toolkit Container is used to build high-performance, efficient, reliable solutions running at the network’s edge.

This tutorial demonstrates how to use the oneAPI Container image for the Intel® oneAPI Base Toolkit to build and run a oneAPI application that offloads computation to a GPU on a Linux* machine.

Setup Environment 

In this tutorial, the host system has an Intel® Core™ i7 processor with Intel® Iris™ Pro Graphics 580 and runs Ubuntu* 20.04. Note that Intel Iris Pro Graphics 580 is a version of Intel® Processor Graphics Gen9, which is supported by Intel® oneAPI Toolkits.

Install the docker engine:
$ sudo apt-get update
$ sudo apt-get install docker-ce docker-ce-cli containerd.io
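On a fresh Ubuntu 20.04 system, the docker-ce packages are not in the default repositories, so the install step above will fail until Docker's own apt repository is added. A minimal sketch of that one-time setup, following Docker's documented repository URL and key location (paths may differ in later Docker releases):

```shell
# Prerequisites for fetching and verifying Docker's signing key
sudo apt-get install -y ca-certificates curl gnupg lsb-release

# Add Docker's official GPG key
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | \
  sudo gpg --dearmor -o /usr/share/keyrings/docker-archive-keyring.gpg

# Register the stable repository for this Ubuntu release (x86_64)
echo "deb [arch=amd64 signed-by=/usr/share/keyrings/docker-archive-keyring.gpg] \
https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable" | \
  sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
```

After this, the apt-get update and install commands above will find the docker-ce packages.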

Start the Docker daemon:
$ sudo systemctl start docker

If you are behind a proxy server, you need to create a systemd drop-in directory for the Docker service and add an http-proxy.conf file that defines the HTTP_PROXY and NO_PROXY environment variables (its contents are shown here with more):
$ sudo mkdir /etc/systemd/system/docker.service.d
$ more /etc/systemd/system/docker.service.d/http-proxy.conf
[Service]
Environment="HTTP_PROXY=http://<yourproxy>:<yourport>/"
Environment="NO_PROXY=localhost,127.0.0.1"
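One way to create that drop-in file non-interactively is with a here-document (a sketch; substitute your actual proxy host and port for the placeholders):

```shell
sudo mkdir -p /etc/systemd/system/docker.service.d
sudo tee /etc/systemd/system/docker.service.d/http-proxy.conf > /dev/null <<'EOF'
[Service]
Environment="HTTP_PROXY=http://<yourproxy>:<yourport>/"
Environment="NO_PROXY=localhost,127.0.0.1"
EOF
```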

Reload the systemd configuration, verify that the Docker service sees the new environment, and restart Docker:
$ sudo systemctl daemon-reload
$ sudo systemctl show --property Environment docker
$ sudo systemctl restart docker

Get the Intel® oneAPI Base Toolkit Container image:
$ image=intel/oneapi-basekit
$ sudo docker pull "$image"

List the downloaded image:
$ sudo docker images
REPOSITORY                 TAG       IMAGE ID       CREATED         SIZE
intel/oneapi-basekit       latest    52043cd2453e   2 months ago    21.7GB

 

Launch the Intel® oneAPI Base Toolkit Container

After successfully pulling the Intel® oneAPI Base Toolkit Container image, launch a Docker container from that image. Note that the container's http_proxy and https_proxy environment variables are set to those of the host, and that --device=/dev/dri passes the host's GPU device nodes into the container.
$ sudo docker run --rm \
  --env http_proxy=${http_proxy} \
  --env https_proxy=${https_proxy} \
  --device=/dev/dri \
  -it $image \
  /bin/bash

root@2348ef11b879:/#

 
The above command starts the container in foreground mode. In the terminal session, you may want to verify that the GPU is present. One way is to install the pciutils package and use the lspci command to search for the GPU:

root:/# apt-get update && apt-get install pciutils
root:/# lspci | grep VGA
01:00.0 VGA compatible controller: Advanced Micro Devices, Inc. [AMD/ATI] Polaris 22 XT [Radeon RX Vega M GH] (rev c0)
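Because the Base Toolkit is already installed in the image, the DPC++ runtime's own device query is another way to confirm that the GPU is visible to oneAPI (the exact output depends on your hardware and driver stack):

```shell
# List the devices the DPC++ runtime can see. An entry for a GPU backend
# (Level-Zero or OpenCL) confirms that --device=/dev/dri worked.
sycl-ls
```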

Get the Code Sample 

Verify that the DPC++ compiler (dpcpp) is already available in the container:
root@2348ef11b879:/# dpcpp --version
Intel(R) oneAPI DPC++ Compiler 2021.2.0 (2021.2.0.20210317)
Target: x86_64-unknown-linux-gnu
Thread model: posix
InstalledDir: /opt/intel/oneapi/compiler/2021.2.0/linux/bin
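Before building the full sample, a quick smoke test can confirm that dpcpp can offload to a device. The sketch below writes a minimal DPC++ program that fills an array on the default device and prints which device was selected; the file name and code are illustrative and not part of the sample repository:

```shell
cat > hello-sycl.cpp <<'EOF'
#include <CL/sycl.hpp>
#include <iostream>

int main() {
  // default_selector prefers a GPU when one is available
  sycl::queue q{sycl::default_selector{}};
  std::cout << "Running on: "
            << q.get_device().get_info<sycl::info::device::name>() << "\n";

  int data[8] = {0};
  {
    sycl::buffer<int, 1> buf{data, sycl::range<1>{8}};
    q.submit([&](sycl::handler& h) {
      auto a = buf.get_access<sycl::access::mode::write>(h);
      h.parallel_for(sycl::range<1>{8}, [=](sycl::id<1> i) {
        a[i] = static_cast<int>(i[0]);
      });
    });
  }  // buffer destruction copies the results back to data

  for (int v : data) std::cout << v << ' ';
  std::cout << '\n';
  return 0;
}
EOF
dpcpp hello-sycl.cpp -o hello-sycl
./hello-sycl
```

If the printed device name is your GPU, offload is working end to end.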

 

Install the git utility and clone the oneAPI code samples, available at https://github.com/oneapi-src/oneAPI-samples:
root:/# apt-get install git
root:/# git clone https://github.com/oneapi-src/oneAPI-samples.git
root:/# ls oneAPI-samples/
AI-and-Analytics  DirectProgramming  README.md  third-party-programs.txt
CODEOWNERS        Libraries          Tools
CONTRIBUTING.md   License.txt        common

Build and run the sample named Nbody according to the instructions in the sample's README file:
root:/# cd oneAPI-samples/DirectProgramming/DPC++/N-BodyMethods/Nbody
root:/# ls
CMakeLists.txt  Nbody.vcxproj          README.md    third-party-programs.txt
License.txt     Nbody.vcxproj.filters  sample.json
Nbody.sln       Nbody.vcxproj.user     src


root:/# mkdir build
root:/# cd build
root:/# cmake ..
root:/# make
root:/# make run
===============================
 Initialize Gravity Simulation
 nPart = 16000; nSteps = 10; dt = 0.1
------------------------------------------------
 s       dt      kenergy     time (s)    GFLOPS
------------------------------------------------
 1       0.1     26.405      0.36868     20.138
 2       0.2     313.77      0.025127    295.47
 3       0.3     926.56      0.025196    294.67
 4       0.4     1866.4      0.024744    300.04
 5       0.5     3135.6      0.024963    297.41
 6       0.6     4737.6      0.024947    297.6
 7       0.7     6676.6      0.024829    299.01
 8       0.8     8957.7      0.024866    298.58
 9       0.9     11587       0.024813    299.21
 10      1       14572       0.025069    296.15
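The sample's device selector picks the GPU when one is available. To confirm the run really offloaded, or to compare against another backend, the DPC++ runtime's SYCL_DEVICE_FILTER environment variable (available in oneAPI 2021.x) can restrict which device the runtime exposes; this is a sketch, and the variable was later superseded in newer releases:

```shell
# Force the Level Zero GPU backend for this run
SYCL_DEVICE_FILTER=level_zero:gpu make run

# Or run the same workload on the CPU via OpenCL to compare timings
SYCL_DEVICE_FILTER=opencl:cpu make run
```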


Summary

Containers allow developers to deploy software together with all of its dependencies as a single package. oneAPI Containers let developers set up and configure an environment for building, deploying, and profiling high-performance, data-centric applications across CPUs, GPUs, and FPGAs. This tutorial showed how to run the Intel® oneAPI Base Toolkit container, download the oneAPI code samples, and then compile and run one of them with the DPC++ compiler available in the image.
 
