From ccd172fb6b8f145823660e27b2d36dcb9017149f Mon Sep 17 00:00:00 2001
From: Ala Raddaoui
Date: Sun, 10 Feb 2019 22:21:50 -0800
Subject: [PATCH] DOC review and cleaning

---
 README.md           | 14 ++++----------
 scripts/validate.sh |  2 +-
 terraform/main.tf   |  2 +-
 3 files changed, 6 insertions(+), 12 deletions(-)

diff --git a/README.md b/README.md
index 5c0ef77..64f616b 100644
--- a/README.md
+++ b/README.md
@@ -115,19 +115,13 @@ The terraform configuration takes two parameters to determine where the Kubernet
 * project
 * zone
 
-For simplicity, these parameters should be specified in a file named terraform.tfvars, in the terraform directory. To generate this file based on your glcoud defaults, run:
-
-./generate-tfvars.sh
-This will generate a terraform/terraform.tfvars file with the following keys. The values themselves will match the output of gcloud config list:
+For simplicity, these parameters should be specified in a file named terraform.tfvars, in the terraform directory. To generate this file based on your gcloud defaults, run the provided script `./scripts/generate-tfvars.sh`; it produces a `terraform/terraform.tfvars` file with the following keys. The values themselves will match the output of `gcloud config list`:
 
 ```
 # Contents of terraform.tfvars
 project="YOUR_PROJECT"
 zone="YOUR_ZONE"
 ```
 
-If you need to override any of the defaults, simply replace the desired value(s) to the right of the equals sign(s). Be sure your replacement values are still double-quoted.
-
-
 #### Deploying the cluster
 
 There are three Terraform files provided with this example. The first one, `main.tf`, is the starting point for Terraform. It describes the features that will be used, the resources that will be manipulated, and the outputs that will result. The second file is `provider.tf`, which indicates which cloud provider and version will be the target of the Terraform commands--in this case GCP. The final file is `variables.tf`, which contains a list of variables that are used as inputs into Terraform. Any variables referenced in the `main.tf` that do not have defaults configured in `variables.tf` will result in prompts to the user at runtime.
@@ -161,7 +155,7 @@ Using the IP:Port value you can now access the application. Go to a browser and
 
 ### Logs in the Stackdriver UI
 
-Stackdriver provides a UI for viewing log events. Basic search and filtering features are provided, which can be useful when debugging system issues. The Stackdriver Logging UI is best suited to exploring more recent log events. Users requiring longer-term storage of log events should consider some the tools in following sections.
+Stackdriver provides a UI for viewing log events. Basic search and filtering features are provided, which can be useful when debugging system issues. The Stackdriver Logging UI is best suited to exploring more recent log events. Users requiring longer-term storage of log events should consider some of the tools in the following sections.
 
 To access the Stackdriver Logging console perform the following steps:
 
@@ -220,7 +214,7 @@ To access the Stackdriver logs in BigQuery perform the following steps:
 ![BigQuery](docs/bigquery.png)
 
 5. Click on the **Query Table** towards the top right to perform a custom query against the table.
-6. This opens the query window. You can simply add an asterisk (*) after the **Select** in the window to pull all details from the current table. **Note:**A 'Select *' query is generally very expensive and not advised. 
For this tutorial the dataset is limited to only the last hour of logs so the overall dataset is relatively small.
+6. This opens the query window. You can simply add an asterisk (*) after the **Select** in the window to pull all details from the current table. **Note:** A 'Select *' query is generally very expensive and not advised. For this tutorial the dataset is limited to only the last hour of logs so the overall dataset is relatively small.
 7. Click the **Run Query** button to execute the query and return some results from the table.
 8. A popup window till ask you to confirm running the query. Click the **Run Query** button on this window as well.
 9. The results window should display some rows and columns. You can scroll through the various rows of data that are returned, or download the results to a local file.
@@ -239,7 +233,7 @@ Since Terraform tracks the resources it created it is able to tear them all down
 
 ### Next Steps
 
-Having used Terraform to deploy an application to Kubernetes Engine, generated logs, and viewed them in Stackdriver, you might consider exploring [Stackdriver Monitoring](https://cloud.google.com/monitoring/) and [Stackdriver Tracing](https://cloud.google.com/trace/). Examples for these topics are available [here](../README.md) and build on the work performed with this document.
+Having used Terraform to deploy an application to Kubernetes Engine, generated logs, and viewed them in Stackdriver, you might consider exploring [Stackdriver Monitoring](https://cloud.google.com/monitoring/) and [Stackdriver Tracing](https://cloud.google.com/trace/). Examples for these topics are available [here](https://github.com/GoogleCloudPlatform?q=gke-tracing-demo++OR+gke-monitoring-tutorial) and build on the work performed with this document.
 
 ## Troubleshooting
 
diff --git a/scripts/validate.sh b/scripts/validate.sh
index 49ea7b1..dde8765 100755
--- a/scripts/validate.sh
+++ b/scripts/validate.sh
@@ -59,7 +59,7 @@ EXT_IP=""
 for ((i=0; i < RETRY_COUNT ; i++)); do
   EXT_IP=$(kubectl get svc "$APP_NAME" -n default \
     -ojsonpath='{.status.loadBalancer.ingress[0].ip}')
-  [ ! -z "$EXT_IP" ] && break
+  [ -n "$EXT_IP" ] && break
   sleep 2
 done
 if [ -z "$EXT_IP" ]
diff --git a/terraform/main.tf b/terraform/main.tf
index 5ff1efc..b9f4a96 100644
--- a/terraform/main.tf
+++ b/terraform/main.tf
@@ -18,7 +18,7 @@ limitations under the License.
 //
 // This configuration will create a GKE cluster that will be used for creating
 // log information to be used by Stackdriver Logging. The configuration will
-// also create the resources and Stackdriver Logging exports for Cloud Storage
+// also create the Kubernetes resources and Stackdriver Logging exports for Cloud Storage
 // and BigQuery.
 //
 ///////////////////////////////////////////////////////////////////////////////////////
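
The patch above references `./scripts/generate-tfvars.sh` without showing its contents. A minimal sketch of what such a generator might look like, based only on the README's description (this is an illustration, not the repository's actual script, and it assumes the `gcloud` CLI is installed with default project and zone values configured):

```
#!/usr/bin/env bash
# Illustrative sketch only -- not the repository's actual generate-tfvars.sh.
# Reads the active gcloud defaults and writes terraform/terraform.tfvars
# with the two keys the Terraform configuration expects.
set -euo pipefail

PROJECT="$(gcloud config get-value project 2>/dev/null || true)"
ZONE="$(gcloud config get-value compute/zone 2>/dev/null || true)"

if [ -z "$PROJECT" ] || [ -z "$ZONE" ]; then
  echo "gcloud defaults missing: run 'gcloud config set project <PROJECT>'" >&2
  echo "and 'gcloud config set compute/zone <ZONE>', then retry." >&2
  exit 1
fi

cat > terraform/terraform.tfvars <<EOF
project="$PROJECT"
zone="$ZONE"
EOF
```

Because Terraform automatically loads a terraform.tfvars file from the directory it runs in, no extra flags are needed as long as the Terraform commands are executed from the terraform directory.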