one option would be to write your CSV payload to blob storage, then ingest that blob into your target table, either by:
- using a "queued ingestion" client in one of the client libraries: https://learn.microsoft.com/en-us/azure/kusto/api/
  - note that the .NET ingestion client library also provides methods such as IngestFromStream and IngestFromDataReader, which handle writing the data to intermediate blob storage so that you don't have to (see the sketch after this list)
- or by issuing an .ingest command: https://learn.microsoft.com/en-us/azure/kusto/management/data-ingestion/ingest-from-storage, though using "direct ingestion" this way is less recommended for Production volumes
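for illustration, here's a minimal sketch of the queued ingestion flow using the Python client (the azure-kusto-data and azure-kusto-ingest packages); the cluster, database, and table names are placeholders, and the class names assume a recent version of the packages:

```python
import io

from azure.kusto.data import KustoConnectionStringBuilder
from azure.kusto.data.data_format import DataFormat
from azure.kusto.ingest import IngestionProperties, QueuedIngestClient

# queued ingestion goes through the cluster's ingestion endpoint
# (note the "ingest-" prefix); "mycluster" is a placeholder
kcsb = KustoConnectionStringBuilder.with_aad_device_authentication(
    "https://ingest-mycluster.kusto.windows.net"
)
client = QueuedIngestClient(kcsb)

props = IngestionProperties(
    database="mydb",  # placeholder database name
    table="sample_table",
    data_format=DataFormat.CSV,
)

# ingest_from_stream uploads the payload to intermediate blob storage
# and queues it for ingestion, so you don't manage blobs yourself
csv_payload = io.StringIO(
    "hello,17,2019-08-16 00:52:07\n"
    "world,71,2019-08-16 00:52:08\n"
)
client.ingest_from_stream(csv_payload, ingestion_properties=props)
```

keep in mind that queued ingestion is asynchronous: the call returns once the payload is queued, and the actual ingestion happens in the background according to the database's batching policy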
another option (not recommended for Production volumes) would be using the .ingest inline command (AKA "ingest push"): https://learn.microsoft.com/en-us/azure/kusto/management/data-ingestion/ingest-inline
for example:
```
.create table sample_table (a:string, b:int, c:datetime)

.ingest inline into table sample_table <|
hello,17,2019-08-16 00:52:07
world,71,2019-08-16 00:52:08
"isn't, this neat?",-13,2019-08-16 00:52:09
```
which will append the above records to the table:
| a | b | c |
|-------------------|------|-----------------------------|
| hello | 17 | 2019-08-16 00:52:07.0000000 |
| world | 71 | 2019-08-16 00:52:08.0000000 |
| isn't, this neat? | -13 | 2019-08-16 00:52:09.0000000 |