-3 votes
0 answers
45 views

When I try to set this up in AWS, I get error messages that don't make sense. The Firehose stream is listed in the output of aws firehose list-delivery-streams --region us-east-2, but SES will ...
lf215 • 1,195
0 votes
2 answers
62 views

Given a CloudWatch -> Firehose -> Splunk flow where Firehose passes incoming log records to a Lambda, the Lambda's return payload is often larger than the allowed 6 MB. I've captured the payload ...
Josh M. • 28.1k
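
The 6 MB limit above is the synchronous Lambda response size that Firehose enforces on transformation functions. One common workaround is to stop emitting transformed payloads once the cumulative response approaches the limit. The sketch below is illustrative, not the asker's code: the safety margin is an assumption, and overflow records are simply marked Dropped here (in practice you would re-ingest them, e.g. with PutRecordBatch, before dropping).

```python
import base64

MAX_RESPONSE_BYTES = 6 * 1024 * 1024  # sync Lambda response limit Firehose enforces
SAFETY_MARGIN = 64 * 1024             # headroom for the JSON envelope (assumption)

def handler(event, context=None):
    """Firehose transformation handler that stops returning transformed
    data once the response would exceed ~6 MB. Overflow records are
    marked Dropped; re-ingest them separately instead of losing them."""
    output, used = [], 0
    for rec in event["records"]:
        raw = base64.b64decode(rec["data"])
        transformed = raw  # real transformation logic goes here
        encoded = base64.b64encode(transformed).decode()
        if used + len(encoded) > MAX_RESPONSE_BYTES - SAFETY_MARGIN:
            # Too big to return in this invocation's response.
            output.append({"recordId": rec["recordId"], "result": "Dropped"})
            continue
        used += len(encoded)
        output.append({"recordId": rec["recordId"], "result": "Ok", "data": encoded})
    return {"records": output}
```

Lowering the stream's buffering hints so each invocation receives fewer records is the other common mitigation.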
0 votes
0 answers
35 views

I am trying to integrate New Relic with an AWS account. I followed the exact steps provided by New Relic: I created an AWS Firehose stream and AWS CloudWatch streams. Everything is working fine, except for ...
XP_2600 • 75
0 votes
1 answer
348 views

I'm working on a task where I need to send AWS ECS (EC2) logs to OpenSearch. For ECS service logs I've created a Fluent Bit daemon service, which sends the service logs to OpenSearch via Firehose. ...
Prathyush Peettayil
1 vote
0 answers
240 views

I’m running into an issue while testing data ingestion into an Iceberg table in the us-east-1 region, so I’d like to ask for some help. Here are the services I’m currently using: Lake Formation + ...
Jmob • 129
0 votes
0 answers
34 views

Using only AWS infracode, is it possible for a stream of events to trigger updates and deletions of existing matched records in a table or database? In other words, are components like EventBridge, ...
benjimin • 5,090
0 votes
0 answers
36 views

Currently, there is an aws_kinesis_firehose_delivery_stream resource created with Terraform: resource "aws_kinesis_firehose_delivery_stream" "test_firehose" { name = "...
Gi Yeon Shin
0 votes
1 answer
525 views

I have a Lambda function that direct-puts JSON strings to a Firehose stream, which delivers batches of records to S3, and I wish to deliver these records as compressed .gz files. However, despite having ...
mmarion • 1,105
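
For the direct-PUT-to-.gz scenario, the compression is configured on the delivery stream's S3 destination, not in the producer. The sketch below shows the shape of the relevant settings and a producer helper; all ARNs and names are placeholders, and note the common pitfall in the comment: if Firehose gzips, the producer must send plain JSON, or the objects end up double-compressed and unreadable.

```python
import json

# Illustrative ExtendedS3DestinationConfiguration fragment for .gz output.
# ARNs and buffering values are placeholders, not the asker's real setup.
s3_destination = {
    "RoleARN": "arn:aws:iam::123456789012:role/firehose-delivery-role",
    "BucketARN": "arn:aws:s3:::my-example-bucket",
    "CompressionFormat": "GZIP",  # Firehose gzips each object and appends .gz
    "BufferingHints": {"IntervalInSeconds": 300, "SizeInMBs": 5},
}

def make_record(obj):
    """Build a Firehose record: plain newline-delimited JSON.
    Do NOT pre-gzip here -- Firehose compresses on delivery."""
    return {"Data": (json.dumps(obj) + "\n").encode()}

# firehose = boto3.client("firehose")
# firehose.put_record(DeliveryStreamName="my-stream", Record=make_record({"id": 1}))
```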
1 vote
1 answer
268 views

Terraform: in a Firehose stream with S3 as the destination, what is the Terraform equivalent of enabling "New line delimiter"? FYI, there is no dynamic partitioning and no Lambda. ...
hari krishnan
2 votes
0 answers
210 views

We have been trying Firehose for Iceberg tables. The source is a Kinesis stream attached to DynamoDB tables, with some Lambda processing in between. The table has been successfully filled by Firehose, but ...
Martin Macak • 3,842
0 votes
1 answer
192 views

I have a case where I take a Licensekey as input to a CloudFormation template, and then I have to use it to create a Firehose delivery stream. This is working fine, because I can just pass the ...
Himanshu Rai
0 votes
1 answer
70 views

I have enabled a Kinesis stream on my DynamoDB table, which is then used by Firehose to back up the data to S3. Now, I want to enable it for multiple tables, but I need a separate folder for each ...
Karmesh Duggar
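
One way to get a separate S3 folder per source table is Firehose dynamic partitioning with a JQ metadata-extraction query keyed on the tableName field that DynamoDB's Kinesis integration includes in every record. The fragment below is illustrative (the prefix layout and query are assumptions, not the asker's configuration):

```python
# Illustrative extended-S3 settings: route delivered objects into one
# folder per source DynamoDB table via dynamic partitioning.
extended_s3_config = {
    # partitionKeyFromQuery:table is filled from the JQ query below
    "Prefix": "ddb-backup/!{partitionKeyFromQuery:table}/",
    "ErrorOutputPrefix": "ddb-backup-errors/",
    "DynamicPartitioningConfiguration": {"Enabled": True},
    "ProcessingConfiguration": {
        "Enabled": True,
        "Processors": [
            {
                "Type": "MetadataExtraction",
                "Parameters": [
                    # DynamoDB's Kinesis records carry a top-level tableName field
                    {"ParameterName": "MetadataExtractionQuery",
                     "ParameterValue": "{table: .tableName}"},
                    {"ParameterName": "JsonParsingEngine",
                     "ParameterValue": "JQ-1.6"},
                ],
            }
        ],
    },
}
```

The alternative is one Firehose stream per table, each with a static prefix, which is simpler but scales poorly as tables are added.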
1 vote
2 answers
810 views

Using AWS Firehose to ingest data into an Iceberg table managed by AWS Glue, I'm unable to insert timestamp data. I'm trying to insert data using the following script: json_data = json.dumps( ...
Crolle • 832
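
For Iceberg timestamp columns fed through Firehose, the encoding of the JSON value matters: an ISO-8601 string is a form that commonly maps onto a timestamp column, whereas an epoch number often fails to convert. This is a sketch under that assumption (the column name is made up; verify the exact format against the table schema):

```python
import json
from datetime import datetime, timezone

# Encode the timestamp as an ISO-8601 string with microseconds; the
# "event_ts" column name is hypothetical.
event_time = datetime(2024, 1, 1, 12, 30, 0, tzinfo=timezone.utc)
json_data = json.dumps({
    "id": 1,
    "event_ts": event_time.strftime("%Y-%m-%dT%H:%M:%S.%f"),
})

# firehose.put_record(DeliveryStreamName="my-stream",
#                     Record={"Data": json_data.encode()})
```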
0 votes
1 answer
277 views

The following AWS CLI command works fine: aws firehose put-record --delivery-stream-name 52N-STA-DF-ICBG --cli-binary-format raw-in-base64-out --record='{"Data":"{\"ADF_Metadata\"...
Humaid Kidwai
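
Building the same request in boto3 sidesteps the shell escaping that the CLI --record argument needs, since json.dumps produces the nested quoting for you. This sketch keeps the stream name from the command above, but the inner payload is a placeholder because the real metadata is elided in the question:

```python
import json

# Construct the nested "JSON string inside JSON" payload in code instead
# of hand-escaping it on the command line. The inner dict is a placeholder.
inner = {"ADF_Metadata": {}}
record = {"Data": json.dumps(inner).encode()}

# firehose = boto3.client("firehose", region_name="us-east-1")
# firehose.put_record(DeliveryStreamName="52N-STA-DF-ICBG", Record=record)
```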
1 vote
0 answers
372 views

I am working on disaster recovery for MSK, where I use AWS Firehose to stream data directly from the MSK cluster into an S3 bucket, and it is very straightforward. I can see raw data in S3 whenever new ...
Prasa2166 • 489
