Keyphrases
In-network Computing (100%), Convolutional Neural Network (100%), Deep Neural Network (100%), Split Computing (100%), Neural Network Inference (100%), Mobile Devices (75%), Computing Node (75%), Convolutional Neural Network Model (50%), Placement Scheme (50%), Inference Latency (50%), Edge Server (50%), Optimization Problem (25%), Promising Technology (25%), Memory Resource (25%), Arithmetic Operations (25%), Low Latency (25%), Evaluation Results (25%), Resource Constraints (25%), Simple Arithmetic (25%), Multiple Layers (25%), Computing Approach (25%), Device Server (25%), Network Device (25%), Packet Processing (25%), Limited Computing Resources (25%), Max Pooling Layer (25%), Edge Switch (25%), Device Edge (25%), Complex Deep Neural Networks (25%), Computational Tasks (25%), Programmable Switches (25%), Mobile Edge (25%)