Conversation
README.md (Outdated)
s3_logging_bucket_name = "${var.s3_logging_bucket_name}"
es_kinesis_delivery_stream = "${var.es_kinesis_delivery_stream}"
}
````
Minor: de-indent this block, use three backticks, and add `hcl` after the top three to get syntax highlighting.
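For reference, the suggested markup is just the tail quoted above, de-indented, inside a three-backtick fence tagged `hcl`:

````markdown
```hcl
s3_logging_bucket_name     = "${var.s3_logging_bucket_name}"
es_kinesis_delivery_stream = "${var.es_kinesis_delivery_stream}"
}
```
````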
Fixed.
README.md (Outdated)
...where the variables referenced above are defined in your terraform.tfvars file.

Following the steps below will emulate this exact behavior. You must execute it from the test directory just below the terraform directory. The test consumes the stack as a module and deploys it, then sets up an EC2 instance that will install the aws-kinesis-agent and configure it to stream to the Kinesis Firehose delivery stream.
Minor: I'd split the README by `Usage` and `Development` (or `Test`) headings, to more clearly delineate.
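Roughly this outline, with heading names taken from this thread (section contents elided):

```markdown
## Usage
<!-- how to consume the stack as a module -->

## Test deployment
<!-- how to run the test from the test directory -->
```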
README.md (Outdated)
````
module "ekk_stack" {
source = "github.com/GSA/devsecops-ekk-stack"
It would actually be `github.com/GSA/devsecops-ekk-stack//terraform` - confirmed by adding to devsecops-example: GSA/devsecops-example#65. Alternatively, can stick to the standard module structure and put the reusable module in the root directory.
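In other words, a consumer's module block would look roughly like this (a sketch assuming the `//terraform` sub-path and the variables from the snippet quoted above):

```hcl
module "ekk_stack" {
  # Note the //terraform sub-path pointing at the module inside the repo
  source = "github.com/GSA/devsecops-ekk-stack//terraform"

  s3_logging_bucket_name     = "${var.s3_logging_bucket_name}"
  es_kinesis_delivery_stream = "${var.es_kinesis_delivery_stream}"
}
```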
Fixed.
ebs_options {
  ebs_enabled = "true"
  iops        = "0"
  volume_size = "20"
Presumably we'd want this configurable, yeah?
I'd like to make a pass at parameterizing a lot of things in a separate PR
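For whenever that PR happens, a minimal sketch of the parameterization (the variable name and default shown here are my assumption, not part of this change):

```hcl
# Hypothetical variable; name and default are illustrative only.
variable "es_ebs_volume_size" {
  description = "Size in GiB of the EBS volume attached to the Elasticsearch domain"
  default     = "20"
}

# ...and the ebs_options block would then reference it:
ebs_options {
  ebs_enabled = "true"
  iops        = "0"
  volume_size = "${var.es_ebs_volume_size}"
}
```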
Just a small thing. Feel free to merge after that!
@@ -6,12 +6,31 @@ This stack is based on [this CloudFormation example.](https://us-west-2.console.

The stack also creates a small EC2 instance (defined in ec2-test.tf) that will be configured with a kinesis agent to test writing into the stream. If you do not wish to deploy this instance, move this file out of the terraform directory or change the extension of the file.
Move this part under the `Test deployment` section.
...where the variables referenced above are defined in your terraform.tfvars file. "var.s3_logging_bucket_name" should be set to a bucket (which the stack will create) to contain copies of the kinesis firehose logs. "var.es_kinesis_delivery_stream" should be set to the name of the firehose delivery stream that you wish to use. The EKK stack will create this delivery stream with the name you provide via this variable.

The Kinesis stream will send to Elasticsearch and S3.
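For what it's worth, the corresponding terraform.tfvars can be as small as this (the bucket and stream names are placeholders, not values from this repo):

```hcl
# terraform.tfvars: substitute your own bucket and delivery stream names
s3_logging_bucket_name     = "my-ekk-firehose-logs"
es_kinesis_delivery_stream = "my-ekk-delivery-stream"
```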
We'll probably want to add a bit about "you'll need to install a logging agent on your instances", or "here's how to forward logs", even if we just link elsewhere. Can take care of that separately though.
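For reference when that gets written: the aws-kinesis-agent reads its forwarding rules from /etc/aws-kinesis/agent.json, so the README addition could be as small as a snippet along these lines (the file pattern and stream name are placeholders):

```json
{
  "flows": [
    {
      "filePattern": "/var/log/messages*",
      "deliveryStream": "my-ekk-delivery-stream"
    }
  ]
}
```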
Closes #3