amazon-cloudwatch

Amazon EC2 AutoScaling CPUUtilization Alarm - INSUFFICIENT_DATA

拜拜、爱过 submitted on 2019-12-06 20:58:06
Question: So I've been using Boto in Python to try to configure auto scaling based on CPUUtilization, more or less exactly as specified in this example: http://boto.readthedocs.org/en/latest/autoscale_tut.html However, both alarms in CloudWatch just report: State Details: State changed to 'INSUFFICIENT_DATA' at 2012/11/12 16:30 UTC. Reason: Unchecked: Initial alarm creation. Auto scaling is working fine, but the alarms aren't picking up any CPUUtilization data at all. Any ideas for things I can try? Edit:
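A common cause of this INSUFFICIENT_DATA state is an alarm whose period or dimensions don't match what EC2 actually publishes: with basic monitoring, CPUUtilization arrives only every 5 minutes, and a group-level alarm must use the AutoScalingGroupName dimension. A minimal sketch with boto3 (the successor to the boto library in the question); the group name and scaling-policy ARN are placeholders:

```python
def alarm_kwargs(asg_name, policy_arn):
    """Build put_metric_alarm arguments for an Auto Scaling CPU alarm.

    Basic monitoring publishes CPUUtilization every 5 minutes, so a
    Period below 300 seconds leaves the alarm stuck in INSUFFICIENT_DATA.
    """
    return {
        "AlarmName": f"{asg_name}-cpu-high",
        "Namespace": "AWS/EC2",
        "MetricName": "CPUUtilization",
        # Dimension must match the metric: group-level CPU is published
        # under AutoScalingGroupName, not InstanceId.
        "Dimensions": [{"Name": "AutoScalingGroupName", "Value": asg_name}],
        "Statistic": "Average",
        "Period": 300,
        "EvaluationPeriods": 2,
        "Threshold": 70.0,
        "ComparisonOperator": "GreaterThanThreshold",
        "AlarmActions": [policy_arn],
    }

def create_alarm(asg_name, policy_arn, region="us-east-1"):
    import boto3  # imported lazily so alarm_kwargs stays usable offline
    cw = boto3.client("cloudwatch", region_name=region)
    cw.put_metric_alarm(**alarm_kwargs(asg_name, policy_arn))
```

If the alarm still shows INSUFFICIENT_DATA, enabling detailed monitoring on the launch configuration (1-minute datapoints) is the other usual fix.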

How to parse mixed text and JSON log entries in AWS CloudWatch for Log Metric Filter

↘锁芯ラ submitted on 2019-12-06 20:44:36
Question: I am trying to parse log entries which are a mix of text and JSON. The first line is a text representation and the next lines are the JSON payload of the event. One possible example is: 2016-07-24T21:08:07.888Z [INFO] Command completed lessonrecords-create { "key": "lessonrecords-create", "correlationId": "c1c07081-3f67-4ab3-a5e2-1b3a16c87961", "result": { "id": "9457ce88-4e6f-4084-bbea-14fff78ce5b6", "status": "NA", "private": false, "note": "Test note", "time": "2016-02-01T01:24:00.000Z",
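CloudWatch metric filter patterns such as `{ $.key = "lessonrecords-create" }` only match events that are pure JSON, so mixed entries like the one above usually need pre-processing (for instance in a Lambda) before filtering. A minimal sketch of splitting such an entry, assuming the JSON payload starts at the first `{`:

```python
import json

def split_entry(entry):
    """Split a mixed log entry into its text header and JSON payload.

    Assumes the leading part is plain text and everything from the
    first '{' onward is the JSON body, as in the example above.
    """
    brace = entry.find("{")
    if brace == -1:
        return entry.strip(), None  # no JSON payload at all
    header = entry[:brace].strip()
    payload = json.loads(entry[brace:])
    return header, payload
```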

A sane way to set up CloudWatch logs (awslogs-agent)

偶尔善良 submitted on 2019-12-06 11:30:45
tl;dr The configuration of the CloudWatch agent is #$%^. Is there any straightforward way? I wanted one place to store the logs, so I used the Amazon CloudWatch Logs Agent. At first it seemed like I'd just add a Resource saying something like "create a log group, then a log stream and send this file, thank you" - all declarative and neat, but... According to this doc I had to set up a JSON configuration that created a BASH script that downloaded a Python script that set up the service that used a generated config in yet another language somewhere else. I'd think logging is something frequently used, so there must
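For what it's worth, once the agent is installed, the per-file part does reduce to one ini stanza in /etc/awslogs/awslogs.conf; the file path and group name below are hypothetical:

```ini
[/var/log/myapp.log]
file = /var/log/myapp.log
log_group_name = my-app
log_stream_name = {instance_id}
datetime_format = %Y-%m-%dT%H:%M:%S
initial_position = start_of_file
```

The agent creates the log group and stream on its own if they don't exist, so the declarative "create a group, a stream, and send this file" intent maps to exactly this stanza.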

Stream logs to elastic using aws cli

☆樱花仙子☆ submitted on 2019-12-06 09:59:31
I would like to enable the Stream to Amazon Elasticsearch Service feature from CloudWatch to Elasticsearch. I'm familiar with how to do that manually; I'm looking for a way to achieve it by running AWS CLI commands. Assuming Elasticsearch is already configured, is there any way to automate the process? Behind the scenes, Stream to Amazon Elasticsearch Service creates a new Lambda function and then pushes the logs to Lambda, then to ELK. destination arn: The Amazon Resource Name (ARN) of the Kinesis stream, Kinesis Data Firehose stream, or Lambda function you want to use as the destination of the subscription feed.
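Once the console wizard has created the LogsToElasticsearch Lambda once, attaching further log groups to it is just a subscription filter. A hedged sketch with boto3 (the programmatic equivalent of `aws logs put-subscription-filter`); the log group name and Lambda ARN are placeholders:

```python
def subscription_kwargs(log_group, lambda_arn):
    """Arguments for put_subscription_filter: what the console wizard
    configures when you pick 'Stream to Amazon Elasticsearch Service'."""
    return {
        "logGroupName": log_group,
        "filterName": f"{log_group}-to-es",
        "filterPattern": "",           # empty pattern forwards every event
        "destinationArn": lambda_arn,  # the LogsToElasticsearch Lambda
    }

def enable_streaming(log_group, lambda_arn, region="us-east-1"):
    import boto3  # lazy import: subscription_kwargs works without AWS access
    logs = boto3.client("logs", region_name=region)
    # The Lambda also needs permission to be invoked by CloudWatch Logs
    # (lambda add-permission), omitted here for brevity.
    logs.put_subscription_filter(**subscription_kwargs(log_group, lambda_arn))
```

The CLI equivalent is `aws logs put-subscription-filter --log-group-name … --filter-name … --filter-pattern "" --destination-arn …`.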

AWS CloudWatch to start/stop EC2 instances

痞子三分冷 submitted on 2019-12-06 06:21:21
Just looking for a way to start/stop an AWS EC2 instance when CPU utilization increases or decreases on another EC2 instance. I know there is an Auto Scaling service available in AWS, but I have a scenario where I can't take advantage of it. So I'm just asking whether this is possible, or whether anyone can help me with it. To detail the concern: suppose I have 2 EC2 instances in an AWS account, named EC21 and EC22. By default, the EC22 instance is stopped. Now I need to set up CloudWatch or some other service to check whether load/CPU utilization on the EC21 instance rises above 70%, and if so start the EC22 server
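One pattern that works without Auto Scaling: a CloudWatch alarm on EC21's CPUUtilization (threshold 70%) whose action publishes to an SNS topic, with a Lambda function subscribed to the topic that starts or stops EC22. A sketch of such a Lambda handler; the instance id is a placeholder:

```python
import json

STANDBY_INSTANCE = "i-0123456789abcdef0"  # placeholder id for "EC22"

def desired_action(alarm_state):
    """Map the alarm state delivered via SNS to an EC2 action."""
    return {"ALARM": "start", "OK": "stop"}.get(alarm_state)

def handler(event, context):
    # SNS wraps the CloudWatch alarm notification as a JSON string
    message = json.loads(event["Records"][0]["Sns"]["Message"])
    action = desired_action(message["NewStateValue"])
    if action is None:
        return  # e.g. INSUFFICIENT_DATA: do nothing
    import boto3  # lazy import keeps desired_action testable offline
    ec2 = boto3.client("ec2")
    if action == "start":
        ec2.start_instances(InstanceIds=[STANDBY_INSTANCE])
    else:
        ec2.stop_instances(InstanceIds=[STANDBY_INSTANCE])
```

Mapping OK back to "stop" gives the symmetric behaviour: EC22 shuts down again once EC21's load falls below the threshold.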

How to restart EC2 instance from CloudWatch alarm

久未见 submitted on 2019-12-06 04:21:30
Sometimes my application dies without any reason, and I can detect that using CloudWatch by the CPU usage metric going down. At that moment I want to restart the Java application or the whole EC2 instance. Any suggestions on how I can achieve that? You can let CloudWatch terminate your EC2 instance and let Auto Scaling bring up another "fresh" instance with your application configured. AWS CloudWatch now provides a reboot EC2 instance action. If only your application halts but the EC2 instance works, you could write a shell script that monitors the app using the CloudWatch API and restarts the app when necessary, then make it a
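The reboot action mentioned above is an ordinary alarm action whose ARN uses the `arn:aws:automate` scheme. A hedged sketch of the put_metric_alarm arguments for "reboot when CPU flatlines"; the threshold and evaluation periods are illustrative:

```python
def reboot_action_arn(region):
    """The built-in EC2 reboot alarm action (the console's
    'Reboot this instance')."""
    return f"arn:aws:automate:{region}:ec2:reboot"

def reboot_alarm_kwargs(instance_id, region):
    """Alarm that fires when CPU stays near zero, i.e. the app looks dead."""
    return {
        "AlarmName": f"{instance_id}-app-dead",
        "Namespace": "AWS/EC2",
        "MetricName": "CPUUtilization",
        "Dimensions": [{"Name": "InstanceId", "Value": instance_id}],
        "Statistic": "Average",
        "Period": 300,
        "EvaluationPeriods": 3,
        "Threshold": 1.0,
        # fire when CPU stays BELOW the floor for 15 minutes
        "ComparisonOperator": "LessThanThreshold",
        "AlarmActions": [reboot_action_arn(region)],
    }
```

These kwargs would be passed to boto3's `cloudwatch.put_metric_alarm`; for an application-only restart, the alarm action could instead be an SNS topic feeding a small watchdog on the instance.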

AWS CloudWatch: EndpointConnectionError: Could not connect to the endpoint URL

时光毁灭记忆、已成空白 submitted on 2019-12-06 03:57:32
Question: I just followed these instructions (Link) to get AWS CloudWatch installed on my EC2 instance. I updated my repositories: sudo yum update -y I installed the awslogs package: sudo yum install -y awslogs I edited /etc/awslogs/awscli.conf, confirming that my AZ is us-west-2b on the EC2 page. I left the default configuration of the /etc/awslogs/awslogs.conf file as is, confirming that the default path indeed has logs being written to it. I checked the /var/log/awslogs.log file and it is
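A frequent cause of EndpointConnectionError with awslogs is putting the Availability Zone (us-west-2b) into /etc/awslogs/awscli.conf where the region (us-west-2) belongs - the agent then tries to reach a host like logs.us-west-2b.amazonaws.com, which does not exist. The distinction in a tiny sketch:

```python
def region_from_az(az):
    """Drop the trailing zone letter: 'us-west-2b' -> 'us-west-2'."""
    return az[:-1] if az and az[-1].isalpha() else az

def logs_endpoint(region):
    """The CloudWatch Logs endpoint the agent will try to reach."""
    return f"https://logs.{region}.amazonaws.com"
```

So awscli.conf should read `region = us-west-2`, even though the EC2 console shows the instance in us-west-2b.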

spark streaming throughput monitoring

杀马特。学长 韩版系。学妹 submitted on 2019-12-06 02:51:22
Question: Is there a way to monitor the input and output throughput of a Spark cluster, to make sure the cluster is not flooded and overwhelmed by incoming data? In my case, I set up the Spark cluster on AWS EC2, so I'm thinking of using AWS CloudWatch to monitor NetworkIn and NetworkOut for each node in the cluster. But my idea seems inaccurate, since network traffic does not mean incoming data for Spark only; some other data would be counted too. Is there a tool or way to monitor
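Since NetworkIn/NetworkOut counts all traffic on the node, a more direct option is publishing Spark's own numbers (e.g. from a StreamingListener's onBatchCompleted callback) as custom CloudWatch metrics. A hedged sketch; the namespace and metric names are made up:

```python
def metric_data(batch_time, records_in, processing_ms):
    """Two custom datapoints describing one micro-batch."""
    return [
        {"MetricName": "RecordsPerBatch", "Timestamp": batch_time,
         "Value": float(records_in), "Unit": "Count"},
        {"MetricName": "ProcessingTime", "Timestamp": batch_time,
         "Value": float(processing_ms), "Unit": "Milliseconds"},
    ]

def publish(batch_time, records_in, processing_ms, region="us-east-1"):
    import boto3  # lazy import so metric_data is testable offline
    cw = boto3.client("cloudwatch", region_name=region)
    cw.put_metric_data(Namespace="Spark/Streaming",
                       MetricData=metric_data(batch_time, records_in,
                                              processing_ms))
```

A CloudWatch alarm on ProcessingTime exceeding the batch interval is then a reasonable "cluster is falling behind" signal, since batches queuing up is the symptom of being flooded.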

How to debug failed fargate task initialization

微笑、不失礼 submitted on 2019-12-06 00:54:40
Question: I have a Fargate task which I have scheduled to run with CloudWatch Event rules, and which outputs a timestamp to a database on a successful run. It also outputs a logfile to CloudWatch every time it runs. However, there was one time when the log file was not created and the database was not updated. I suspect the task was never even started, or failed to start. In CloudWatch, the event rule shows a trigger and invocation at the time I expected the task to run, so I assume the task at least
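Two places usually hold the evidence: ECS keeps recently stopped tasks (with a stoppedReason such as "CannotPullContainerError" or "Essential container in task exited") around for a short while, and the Event rule publishes a FailedInvocations metric. A sketch of pulling the stop reasons with boto3; the cluster name is a placeholder:

```python
def stop_reasons(describe_tasks_response):
    """Extract the useful failure fields from ecs.describe_tasks output."""
    return [
        {
            "taskArn": t.get("taskArn"),
            "stoppedReason": t.get("stoppedReason"),
            "containers": [
                {"name": c.get("name"), "reason": c.get("reason"),
                 "exitCode": c.get("exitCode")}
                for c in t.get("containers", [])
            ],
        }
        for t in describe_tasks_response.get("tasks", [])
    ]

def recent_failures(cluster="my-cluster", region="us-east-1"):
    import boto3  # lazy import keeps stop_reasons testable offline
    ecs = boto3.client("ecs", region_name=region)
    arns = ecs.list_tasks(cluster=cluster, desiredStatus="STOPPED")["taskArns"]
    if not arns:
        return []  # stopped tasks age out quickly, so check soon after
    return stop_reasons(ecs.describe_tasks(cluster=cluster, tasks=arns))
```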

AWS CloudWatch Alarm On Startup Of New EC2 Instance

耗尽温柔 submitted on 2019-12-05 21:28:45
I want to apply a CloudWatch alarm to instances as they are being created. The alarm should send a message to an email account when CPU usage drops below 10% for 1 full day. I believe the best way of achieving this is by using a User Data script at the time of launching the instance. Yes, you could use User Data to create the CloudWatch alarm and notification. Start by creating an Amazon SNS topic for receiving notifications, and subscribe an email address to it. This SNS topic can be used for all notifications, so it only needs to be created once. Then, create a User Data
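A sketch of what such a User Data script could run (the SNS topic ARN is a placeholder): it reads its own instance id from instance metadata, then creates an alarm that fires when average CPU stays below 10% for 24 consecutive hourly datapoints:

```python
import urllib.request

METADATA_URL = "http://169.254.169.254/latest/meta-data/instance-id"
TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:low-cpu"  # placeholder

def idle_alarm_kwargs(instance_id, topic_arn):
    """CPU below 10% for 24 hourly datapoints -> notify via SNS."""
    return {
        "AlarmName": f"{instance_id}-idle",
        "Namespace": "AWS/EC2",
        "MetricName": "CPUUtilization",
        "Dimensions": [{"Name": "InstanceId", "Value": instance_id}],
        "Statistic": "Average",
        "Period": 3600,
        "EvaluationPeriods": 24,
        "Threshold": 10.0,
        "ComparisonOperator": "LessThanThreshold",
        "AlarmActions": [topic_arn],
    }

def main():
    # call this from the User Data script at boot; the instance role
    # needs cloudwatch:PutMetricAlarm permission
    instance_id = urllib.request.urlopen(METADATA_URL, timeout=2).read().decode()
    import boto3  # lazy import keeps idle_alarm_kwargs testable offline
    boto3.client("cloudwatch").put_metric_alarm(
        **idle_alarm_kwargs(instance_id, TOPIC_ARN))
```

The User Data itself then only needs to install boto3 and invoke this script once at first boot.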