Running MAESTRO
Parameters Input to MAESTRO
--HW_file='data/hw/accelerator_1.m' : Specify the Hardware parameters file
--Mapping_file='data/mapping/Resnet50_kcp.m' : Specify the target dataflow and layer description file
--print_res=true/false : If set to true, MAESTRO prints detailed cost information to the screen
--print_res_csv_file=true/false : If set to true, MAESTRO writes a CSV file containing various statistics
--print_log_file=true/false : If set to true, MAESTRO writes a log file with detailed information about computation patterns to "log.txt"
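Putting these parameters together, a typical invocation looks roughly like the following. This is a sketch: it assumes the MAESTRO binary built in the repository root is named ./maestro, and the run_example.sh script used at the end of this page wraps a command of this form.

./maestro --HW_file='data/hw/accelerator_1.m' \
          --Mapping_file='data/mapping/Resnet50_kcp.m' \
          --print_res=true \
          --print_res_csv_file=true \
          --print_log_file=false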
Hardware Description File
The Hardware Description file is provided as an input to MAESTRO as:
--HW_file='data/hw/accelerator_1.m' : Specify the Hardware parameters file
This file contains the hardware description as follows:
| Entry | Description |
|---|---|
| num_pes: 1024 | Number of PEs |
| l1_size_cstr: 256 | L1 buffer size constraint (optional) |
| l2_size_cstr: 4096 | L2 buffer size constraint (optional) |
| noc_bw_cstr: 64 | NoC bandwidth constraint (optional) |
| offchip_bw_cstr: 1 | Off-chip memory bandwidth constraint (optional) |
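Written out as a file, data/hw/accelerator_1.m is simply a list of these key: value entries. A sketch based on the values above (the constraint entries may be omitted, as noted below):

num_pes: 1024
l1_size_cstr: 256
l2_size_cstr: 4096
noc_bw_cstr: 64
offchip_bw_cstr: 1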
Note
For more information on the supported hardware, please see the Hardware Supported page.
Note
All the constraints are optional. If not specified, MAESTRO assumes infinite resources and computes the required amount of each resource, which is reported in the res.csv file.
Note
The l1_size_cstr and l2_size_cstr constraints are intended mainly for use with a mapper; most users do not need to specify them.
When l1_size_cstr and l2_size_cstr are specified, MAESTRO checks whether the constraints are met and prints a warning message if they are not.
The required L2/L1 sizes are reported in the res.csv file as "L2/L1 SRAM Size Req."
Note
When noc_bw_cstr and offchip_bw_cstr are specified, they are used as the respective bandwidth constraints and affect the computed runtime.
The NoC/off-chip bandwidth required to avoid being bandwidth-bound is reported in the res.csv file as "NoC/offchip BW Req."
Mapping Description
A mapping is a MAESTRO input file which contains a DNN model and the dataflow for each layer.
Network MobileNetV2 {
    Layer CONV1 {
        Type: CONV
        Stride { X: 2, Y: 2 }
        Dimensions { K: 32, C: 3, R: 1, S: 1, Y: 224, X: 224 }
        Dataflow {
            SpatialMap(1,1) K;
            TemporalMap(64,64) C;
            TemporalMap(Sz(R),Sz(R)) R;
            TemporalMap(Sz(S),Sz(S)) S;
            TemporalMap(Sz(R),1) Y;
            TemporalMap(Sz(S),1) X;
            Cluster(64, P);
            SpatialMap(1,1) C;
            TemporalMap(Sz(R),1) Y;
            TemporalMap(Sz(S),1) X;
            TemporalMap(Sz(R),Sz(R)) R;
            TemporalMap(Sz(S),Sz(S)) S;
        }
    }
}
| Entry | Description |
|---|---|
| Network MobileNetV2 | DNN model name |
| Layer CONV1 | Layer name |
| Type: CONV | Type of the layer |
| Dimensions { K: 32, C: 3, R: 1, S: 1, Y: 224, X: 224 } | Layer dimensions |
| Dataflow | Dataflow for the layer, in MAESTRO compiler directives |
How to generate a Mapping
This tutorial provides an easy way to generate a mapping from a PyTorch/Keras model.
Autogenerating a mapping
MAESTRO can be run with several mappers to automatically search for optimized mappings. For more information about these mappers, please refer to the following project pages:
Heuristics: Marvel
Optimization Methods: GAMMA
Generate a MAESTRO DNN Model file from a PyTorch/Keras model
We use tensorflow 2.0.0, torch 1.3.0, and torchvision 0.4.1 with python 3.7.4.
cd tools/frontend
Check the help messages for future reference:
python frameworks_to_modelfile_maestro.py --help
python frameworks_to_modelfile_maestro.py --api_name pytorch --input_size 3,224,224 --model mobilenet_v2 --outfile dnn_model.m
--api_name: the API name; choose from "pytorch" or "keras"
--input_size: the input image size of the first layer
--model: the model name from torchvision.models (or tensorflow.keras.applications).
To use a custom model, enter custom for this argument.
--custom: the name of the custom network Python file.
The file should define a function whose name matches the file name and returns the model; see the sketch below.
(This option currently works only for Keras.)
--outfile: the MAESTRO model output file name
The MAESTRO DNN Model, dnn_model.m, will be generated in ../../data/model
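For the --custom option, a minimal Keras model file might look like the following sketch. The file name custom_cnn.py and the layer choices are hypothetical; the only stated requirements are that the file defines a function whose name matches the file name and that the function returns the model.

# custom_cnn.py: hypothetical custom Keras network for use with the --custom option
import tensorflow as tf

def custom_cnn():
    # Function name matches the file name; it builds and returns a Keras model.
    model = tf.keras.Sequential([
        tf.keras.layers.Conv2D(32, kernel_size=3, strides=2, activation='relu',
                               input_shape=(224, 224, 3)),
        tf.keras.layers.Conv2D(64, kernel_size=3, activation='relu'),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(1000),
    ])
    return model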
Generate a MAESTRO Mapping file with the MAESTRO DNN Model file and a specific dataflow
Check the help messages for future reference:
python modelfile_to_mapping.py --help
python modelfile_to_mapping.py --model_file dnn_model.m --dataflow kcp_ws --outfile out.m
--model_file: the MAESTRO DNN model file, either written by the user or generated by the script above
--dataflow: the dataflow for each layer; choose from "ykp_os", "kcp_ws", "xp_ws", "rs"
--outfile: the MAESTRO Mapping output file name
The mapping file, out.m, will be generated in ../../data/mapping
Run MAESTRO with the generated mapping
Go back to the maestro directory
cd ../../
Change the contents of "run_example.sh" to use the generated mapping file:
--Mapping_file='data/mapping/out.m'
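With that change, the script body mirrors the parameters listed at the top of this page, roughly as follows (a sketch; the binary name ./maestro and the remaining flag values in your copy of the script may differ):

./maestro --HW_file='data/hw/accelerator_1.m' \
          --Mapping_file='data/mapping/out.m' \
          --print_res=true \
          --print_res_csv_file=true \
          --print_log_file=false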
Run MAESTRO
./run_example.sh