
High-Volume IoT Sensor Data Ingestion with Partitioning and Redundancy

Test Scenario & Use Case

Business Context

A manufacturing company ingests millions of data points from IoT sensors on its assembly line machines. To enable rapid failure analysis and predictive modeling, the data must be loaded into CAS with high availability (redundancy) and partitioned by machine for query performance. The table must also be accessible to data science teams across the organization.
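To make the two server-side concepts concrete before diving into the SAS code: partitioning groups all rows for a given key (here, a machine) on one worker so per-machine queries touch a single partition, and redundancy keeps extra copies of each partition on other workers so a node failure does not lose data. The sketch below is a deliberately simplified round-robin placement model, not CAS's actual distribution algorithm; the function name and worker numbering are illustrative only.

```python
def place_partitions(machine_ids, n_workers, copies):
    """Assign each machine_id partition to a primary worker plus
    `copies` backup workers (round-robin). A simplified sketch of
    partitioned, redundant placement -- not the real CAS algorithm."""
    placement = {}
    for mid in machine_ids:
        primary = mid % n_workers
        # the partition lives on 1 primary worker + `copies` backups
        placement[mid] = [(primary + k) % n_workers for k in range(copies + 1)]
    return placement

# 10 machines, 4 workers, 2 redundant copies (matching this scenario)
placement = place_partitions(range(1, 11), n_workers=4, copies=2)
# each partition is held by 3 distinct workers: losing any one worker
# still leaves 2 intact copies of every machine's data
```

With `copies=2`, any single-worker failure leaves every partition available, which is exactly the fault tolerance the scenario asks for.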
Data Preparation

Generation of a large dataset simulating sensor readings from multiple machines over time.

DATA WORK.SENSOR_DATA;
   FORMAT timestamp datetime20.;
   call streaminit(789);
   DO machine_id = 1 TO 10;
      DO i = 1 TO 50000; /* 500,000 total rows */
         sensor_id = mod(i, 5) + 1;
         timestamp = '20NOV2025:00:00:00'dt + (i * 1.5);
         /* rand('NORMAL') honors the streaminit seed; rannor(0) would not */
         temperature = 70 + (rand('NORMAL') * 10);
         pressure = 30 + (rand('NORMAL') * 2);
         OUTPUT;
      END;
   END;
   DROP i; /* keep only the columns declared in the addTable vars list */
RUN;
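As a sanity check on the shape of the generated data, the same generation logic can be mirrored in plain Python: 10 machines × 50,000 readings each, sensor IDs cycling 1–5, and timestamps spaced 1.5 seconds apart from the 20NOV2025 base. This is a stand-alone illustration of the loop structure, not part of the SAS workflow, and `random.gauss` here only approximates the role of SAS's normal-deviate function.

```python
import random
from datetime import datetime, timedelta

random.seed(789)
base = datetime(2025, 11, 20)  # '20NOV2025:00:00:00'dt
rows = []
for machine_id in range(1, 11):          # DO machine_id = 1 TO 10
    for i in range(1, 50001):            # DO i = 1 TO 50000
        rows.append((
            machine_id,
            i % 5 + 1,                               # sensor_id cycles 1..5
            base + timedelta(seconds=i * 1.5),       # 1.5 s between readings
            70 + random.gauss(0, 1) * 10,            # temperature
            30 + random.gauss(0, 1) * 2,             # pressure
        ))
# 10 machines x 50,000 readings = 500,000 rows total
```

The row count of 500,000 is the figure the final tableDetails verification should confirm.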

Implementation Steps

1
Upload the large sensor dataset, partitioned by 'machine_id', with 2 redundant copies for fault tolerance, and promote it for global access.
PROC CAS;
   table.addTable /
      caslib='casuser'
      table='IOT_SENSOR_LOGS'
      datatable='SENSOR_DATA'
      promote=TRUE
      copies=2
      partition={'machine_id'}
      vars={{name='machine_id', type='int64'},
            {name='sensor_id', type='int64'},
            {name='timestamp', type='double', format='DATETIME20.'},
            {name='temperature', type='double'},
            {name='pressure', type='double'}};
RUN;
QUIT;
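The query-performance payoff of `partition={'machine_id'}` is partition pruning: a filter on the partition key only has to scan the one partition that can match, instead of the whole table. The toy model below illustrates that idea with an in-memory dict of partitions; the data values and function name are made up for the example and the mechanism is a simplification of what the server does.

```python
from collections import defaultdict

# Build a tiny table partitioned by machine_id (illustrative values only).
partitions = defaultdict(list)
for machine_id in range(1, 11):
    for temp in (68.0, 71.5, 90.2):
        partitions[machine_id].append(
            {"machine_id": machine_id, "temperature": temp}
        )

def query_max_temp(partitions, machine_id):
    """Scan only the single partition that can contain this machine --
    the pruning that makes per-machine failure analysis fast."""
    rows = partitions[machine_id]   # 1 partition touched, not all 10
    return max(r["temperature"] for r in rows)
```

A query like `query_max_temp(partitions, 7)` inspects 3 rows instead of 30; on the real 500,000-row table, a per-machine query similarly touches roughly a tenth of the data.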
2
Verification step: Use table.tableDetails to confirm the table is promoted, partitioned, and has the correct number of copies.
PROC CAS;
   table.tableDetails / caslib='casuser' name='IOT_SENSOR_LOGS';
RUN;
QUIT;

Expected Result


A global-scope table named 'IOT_SENSOR_LOGS' is created in the 'casuser' caslib. The tableDetails output should show the table partitioned by 'machine_id', with 'Number of Copies' set to 2 and 'Scope' set to 'Global', making it accessible to other sessions. The total row count should be 500,000.