Create a logStream for each log file in cloudwatchLogs
I use the AWS CloudWatch Logs agent to push my application logs to CloudWatch. In the CloudWatch Logs config file on my EC2 instance, I have this entry:
[/scripts/application]
datetime_format = %Y-%m-%d %H:%M:%S
file = /workingdir/customer/logfiles/*.log
buffer_duration = 5000
log_stream_name = {instance_id}
initial_position = start_of_file
log_group_name = /scripts/application
According to this configuration, all log files in the workingdir directory are sent to CloudWatch Logs in the same stream, whose name is the instance ID.
My question is: I want to create a separate log stream for each log file, so that the logs are faster to read and easier to parse. In other words, every time there is a new log file, a new log stream should be created automatically.
I thought of doing this with a shell script in a cron job, but then I would have to change many other configurations in the architecture, so I'm looking for a way to do it in the config file. The documentation says:
log_stream_name
Specifies the destination log stream. You can use a literal string or
predefined variables ({instance_id}, {hostname}, {ip_address}), or
combination of both to define a log stream name. A log stream is
created automatically if it doesn't already exist.
The names of the log files can't be predicted with 100% certainty, but they always follow this structure:
CustomerName-YYYY-mm-dd.log
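For illustration, since the names follow that pattern, a per-file stream name could be derived from the file name itself. This is only a sketch: the regex assumes the date is the last hyphen-separated part of the name, and the `Acme` customer name is made up.

```python
import re

def stream_name_for(filename):
    """Derive a log stream name from a file named CustomerName-YYYY-mm-dd.log."""
    m = re.match(r"^(.+)-(\d{4}-\d{2}-\d{2})\.log$", filename)
    if not m:
        raise ValueError("unexpected log file name: %s" % filename)
    customer, date = m.groups()
    # One stream per file: combine customer and date
    return "%s/%s" % (customer, date)

print(stream_name_for("Acme-2017-03-21.log"))  # Acme/2017-03-21
```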
Another problem is that:
A running agent must be stopped and restarted for configuration
changes to take effect.
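Because the agent only reads its config at startup, the cron-job idea above would amount to regenerating one config section per log file and then restarting the agent. A minimal sketch of the generation step, assuming the paths from the question and a hypothetical agent config location; the restart itself is deliberately left out:

```python
import glob
import os

LOG_DIR = "/workingdir/customer/logfiles"      # from the question
CONFIG_PATH = "/var/awslogs/etc/awslogs.conf"  # hypothetical config path

# One section template per file; each file gets its own log_stream_name.
SECTION = """\
[{name}]
datetime_format = %Y-%m-%d %H:%M:%S
file = {path}
buffer_duration = 5000
log_stream_name = {name}
initial_position = start_of_file
log_group_name = /scripts/application
"""

def build_sections(paths):
    """Build one [section] per log file, named after the file."""
    parts = []
    for path in sorted(paths):
        name = os.path.splitext(os.path.basename(path))[0]
        parts.append(SECTION.format(name=name, path=path))
    return "\n".join(parts)

if __name__ == "__main__":
    config = build_sections(glob.glob(os.path.join(LOG_DIR, "*.log")))
    # Writing `config` to CONFIG_PATH and restarting the agent
    # (e.g. via the service manager) would go here; omitted in this sketch.
    print(config)
```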
How can I set the log stream in this case? Any ideas, suggestions, or workarounds are much appreciated.
Tags: amazon-web-services, amazon-ec2, configuration, stream, amazon-cloudwatchlogs
I know you asked this quite some time ago but did you find a solution in the end? I'm experiencing the same limitations in trying to configure the AWS logging agent to be more automated, avoiding the need to configure each log stream separately; we simply have too many log files for this to be feasible, so any input is greatly appreciated. Thanks!
– Kevin, Jan 17 '18 at 17:05
asked Mar 21 '17 at 10:40 by Souad (edited Mar 21 '17 at 11:26)