Keyphrases
Single-hop (79%)
Optical Fiber (68%)
Transmitter (65%)
Multichannel ALOHA (61%)
Redundancy (59%)
Distributed Storage (48%)
ALOHA Network (44%)
Caching (42%)
Video on Demand (37%)
Shared Memory (35%)
Transceiver (32%)
Parallelization (32%)
Single Photon Avalanche Diode (32%)
Uniform Traffic (29%)
Muzzle Flash Detection (28%)
Judicious Use (28%)
InfiniBand (27%)
CPU Architecture (27%)
Broadcast Channel (27%)
Controller (27%)
Parallel Computing (27%)
Distributed Computation (27%)
Traffic Capacity (27%)
Storage Server (27%)
Error Correction Codes (26%)
Reconfiguration (25%)
Sender (25%)
Coding Scheme (25%)
Memory Access (25%)
Disk-shaped (25%)
High Performance (25%)
Inbound (23%)
Star Coupler (23%)
Power Budget (23%)
Storage Capacity (23%)
Multilevel Cell (23%)
Slotted ALOHA (23%)
Performance Improvement (23%)
ALOHA (23%)
Horizontal Line (21%)
Non-intersecting (21%)
Vertical Line (21%)
Fault-tolerant (21%)
NP-complete (21%)
Many-core Processor (21%)
Index Coding (21%)
Processing Element (21%)
Security Support (21%)
Scalable Security (21%)
Disjoint (21%)
Computer Science
Computer Hardware (100%)
Shared Memories (86%)
Data Center (43%)
Retransmission (43%)
Video-on-Demand (41%)
Channel Capacity (34%)
Traffic Pattern (32%)
Access Schemes (32%)
Storage Capacity (30%)
Load Balancing (30%)
Propagation Delay (30%)
Communication Network (27%)
Parallel Computation (27%)
Many-Core (27%)
Distributed Computation (27%)
Storage Server (27%)
Memory Access (27%)
Traffic Requirement (25%)
Disk Drive (25%)
Fault-tolerance (25%)
Transmission Rate (25%)
Storage System (24%)
Broadcast Channel (24%)
Inbound Traffic (23%)
Public Cloud (21%)
Fault Tolerant (21%)
Deadlock Avoidance (21%)
Interference Mitigation (21%)
Intercell Interference (21%)
Supernode (21%)
Faulty Processor (21%)
Packet Radio Network (21%)
Networks on Chips (21%)
Interconnect Delay (21%)
Core Processor (21%)
Processing Element (21%)
Memory Array (21%)
Transaction Processing (21%)
Covariance Matrix (21%)
Local Area Network (21%)
Deduplication (21%)
Side Information (21%)
Rectangular Array (21%)
Multiplexing (21%)
Prefetching (21%)
Spread Spectrum (21%)
Code Division Multiple Access (21%)
Field Programmable Gate Arrays (21%)
Error-Correcting Codes (21%)
Data Parallelism (20%)