Computer Science A-level (CIE)
Parallel computing
Massively parallel systems
What are massively parallel computers?
- Massively parallel computers are systems made up of thousands of processors working simultaneously to solve a single large problem
- Each processor executes part of a program, and the results are combined to produce the final output
- They are designed to tackle complex, large-scale tasks in fields such as:
  - Scientific research
  - Weather simulation
  - Cryptography
  - Artificial intelligence (AI)
Key characteristics
| Feature | Description |
|---|---|
| Thousands of processors | Many processors are connected and work together to execute different parts of the same task |
| Shared or distributed memory | Each processor may have its own memory or access shared memory resources |
| High-speed interconnects | Processors are linked via fast communication pathways to share results |
| Data parallelism | Many data items can be processed at once, often using SIMD or MIMD |
| Task parallelism | Different processors may perform different tasks on different data sets |
| Specialised software | Requires software written to distribute the work efficiently across processors |
| Tightly coupled | Processors depend on one another and work collaboratively as a single system |
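The "split the work, process chunks in parallel, combine partial results" pattern described above can be modelled on a single multi-core machine. This is only a sketch of the idea using Python's standard `multiprocessing` module (the function and variable names are illustrative, not from any real supercomputing library):

```python
from multiprocessing import Pool

def partial_sum(chunk):
    """Each worker computes a partial result for its share of the data."""
    return sum(x * x for x in chunk)

def parallel_sum_of_squares(data, workers=4):
    """Split the data, process the chunks in parallel, combine the results."""
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with Pool(workers) as pool:
        partials = pool.map(partial_sum, chunks)  # one chunk per process
    return sum(partials)  # combine partial results into the final output

if __name__ == "__main__":
    print(parallel_sum_of_squares(list(range(1000))))  # prints 332833500
```

A real massively parallel system applies the same pattern at far larger scale, with thousands of processors exchanging partial results over high-speed interconnects instead of a few processes sharing one machine.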
Massively parallel vs cluster computers
| Massively parallel computers | Cluster computers |
|---|---|
| Thousands of processors form a single tightly integrated system | Multiple independent systems networked together |
| Processors communicate continuously via shared architecture | Communication occurs via a network, often loosely coupled |
| Acts like one machine with distributed processing | A group of co-operating systems (can be SIMD-based) |
| Higher performance, used for supercomputing tasks | More general-purpose or batch-processing systems |
Link to computer architectures
- Massively parallel systems often use:
  - SIMD: apply one instruction to many data points simultaneously
  - MIMD: run different instructions on different data, fully independently
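The SIMD/MIMD distinction can be modelled in ordinary Python. This is a conceptual sketch, not real vector hardware: the SIMD part uses a uniform map over the data to stand in for a vector unit applying one instruction to many elements, while the MIMD part runs two different functions on two different data sets concurrently (all names here are illustrative):

```python
from concurrent.futures import ProcessPoolExecutor

# SIMD-style: ONE operation applied uniformly to MANY data items.
# (Real SIMD happens in a vector unit; a map over the data models the idea.)
def simd_double(values):
    return [v * 2 for v in values]  # same instruction, every element

# MIMD-style: DIFFERENT instructions run on DIFFERENT data at the same time.
def total(values):
    return sum(values)

def largest(values):
    return max(values)

def mimd_run(data_a, data_b):
    with ProcessPoolExecutor(max_workers=2) as ex:
        f1 = ex.submit(total, data_a)    # processor 1: a summing task
        f2 = ex.submit(largest, data_b)  # processor 2: a searching task
        return f1.result(), f2.result()

if __name__ == "__main__":
    print(simd_double([1, 2, 3]))       # [2, 4, 6]
    print(mimd_run([1, 2, 3], [7, 5]))  # (6, 7)
```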