From d7f7f2052004a13ce09e9bbbc404e79aefcff6df Mon Sep 17 00:00:00 2001 From: Priyansh Khodiyar Date: Wed, 13 Jul 2022 02:23:06 +0530 Subject: [PATCH 01/43] 1st iteration --- CONTRIBUTING.md | 297 +++++++++++++++++++++++++----------------------- 1 file changed, 155 insertions(+), 142 deletions(-) diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index 54ff60451b..ca6eef6f25 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -1,147 +1,13 @@ -# How to Contribute +## Welcome to SigNoz contributing section. Thank you for taking out the time to contribute to this project. -There are primarily 2 areas in which you can contribute in SigNoz +Sections: +1. [General Guidelines](#1-general-instructions) +2. [How to Contribute](#2-how-to-contribute) +3. [Develop Frontend](#3-develop-frontend) +4. [Contribute to Query-Service](#4-contribute-to-query-service) +5. [Contribute to SigNoz Helm Chart](#5-contribute-to-signoz-helm-chart) -- Frontend ( written in Typescript, React) -- Backend - ( Query Service - written in Go) - -Depending upon your area of expertise & interest, you can chose one or more to contribute. Below are detailed instructions to contribute in each area - -> Please note: If you want to work on an issue, please ask the maintainers to assign the issue to you before starting work on it. This would help us understand who is working on an issue and prevent duplicate work. πŸ™πŸ» - -> If you just raise a PR, without the corresponding issue being assigned to you - it may not be accepted. 
- -# Develop Frontend - -Need to update [https://github.com/SigNoz/signoz/tree/main/frontend](https://github.com/SigNoz/signoz/tree/main/frontend) - -### Contribute to Frontend with Docker installation of SigNoz - -- `git clone https://github.com/SigNoz/signoz.git && cd signoz` -- comment out frontend service section at `deploy/docker/clickhouse-setup/docker-compose.yaml#L62` -- run `cd deploy` to move to deploy directory -- Install signoz locally without the frontend - - Add below configuration to query-service section at `docker/clickhouse-setup/docker-compose.yaml#L38` - - ```docker - ports: - - "8080:8080" - ``` - - If you are using x86_64 processors (All Intel/AMD processors) run `sudo docker-compose -f docker/clickhouse-setup/docker-compose.yaml up -d` - - If you are on arm64 processors (Apple M1 Macbooks) run `sudo docker-compose -f docker/clickhouse-setup/docker-compose.arm.yaml up -d` -- `cd ../frontend` and change baseURL to `http://localhost:8080` in file `src/constants/env.ts` -- `yarn install` -- `yarn dev` - -> Notes for Maintainers/Contributors who will change Line Numbers of Frontend & Query-Section. Please Update Line Numbers in `./scripts/commentLinesForSetup.sh` - -### Contribute to Frontend without installing SigNoz backend - -If you don't want to install SigNoz backend just for doing frontend development, we can provide you with test environments which you can use as the backend. 
Please ping us in #contributing channel in our [slack community](https://signoz.io/slack) and we will DM you with `` - -- `git clone https://github.com/SigNoz/signoz.git && cd signoz/frontend` -- Create a file `.env` with `FRONTEND_API_ENDPOINT=` -- `yarn install` -- `yarn dev` - -**_Frontend should now be accessible at `http://localhost:3301/application`_** - -# Contribute to Query-Service - -Need to update [https://github.com/SigNoz/signoz/tree/main/pkg/query-service](https://github.com/SigNoz/signoz/tree/main/pkg/query-service) - -### To run ClickHouse setup (recommended for local development) - -- git clone https://github.com/SigNoz/signoz.git -- run `cd signoz` to move to signoz directory -- run `sudo make dev-setup` to configure local setup to run query-service -- comment out frontend service section at `docker/clickhouse-setup/docker-compose.yaml` -- comment out query-service section at `docker/clickhouse-setup/docker-compose.yaml` -- add below configuration to clickhouse section at `docker/clickhouse-setup/docker-compose.yaml` -```docker - expose: - - 9000 - ports: - - 9001:9000 -``` - -- run `cd pkg/query-service/` to move to query-service directory -- Open ./constants/constants.go - - Replace ```const RELATIONAL_DATASOURCE_PATH = "/var/lib/signoz/signoz.db"``` \ - with ```const RELATIONAL_DATASOURCE_PATH = "./signoz.db".``` - -- Install signoz locally without the frontend and query-service - - If you are using x86_64 processors (All Intel/AMD processors) run `sudo make run-x86` - - If you are on arm64 processors (Apple M1 Macbooks) run `sudo make run-arm` - -#### Run locally -```console -ClickHouseUrl=tcp://localhost:9001 STORAGE=clickhouse go run main.go -``` - -> Notes for Maintainers/Contributors who will change Line Numbers of Frontend & Query-Section. 
Please Update Line Numbers in `./scripts/commentLinesForSetup.sh` - -**_Query Service should now be available at `http://localhost:8080`_** - -> If you want to see how, frontend plays with query service, you can run frontend also in you local env with the baseURL changed to `http://localhost:8080` in file `src/constants/env.ts` as the query-service is now running at port `8080` - ---- - - -# Contribute to SigNoz Helm Chart - -Need to update [https://github.com/SigNoz/charts](https://github.com/SigNoz/charts). - -### To run helm chart for local development - -- run `git clone https://github.com/SigNoz/charts.git` followed by `cd charts` -- it is recommended to use lightweight kubernetes (k8s) cluster for local development: - - [kind](https://kind.sigs.k8s.io/docs/user/quick-start/#installation) - - [k3d](https://k3d.io/#installation) - - [minikube](https://minikube.sigs.k8s.io/docs/start/) -- create a k8s cluster and make sure `kubectl` points to the locally created k8s cluster -- run `make dev-install` to install SigNoz chart with `my-release` release name in `platform` namespace. 
-- run `kubectl -n platform port-forward svc/my-release-signoz-frontend 3301:3301` to make SigNoz UI available at [localhost:3301](http://localhost:3301) - -**To install HotROD sample app:** - -```bash -curl -sL https://github.com/SigNoz/signoz/raw/main/sample-apps/hotrod/hotrod-install.sh \ - | HELM_RELEASE=my-release SIGNOZ_NAMESPACE=platform bash -``` - -**To load data with HotROD sample app:** - -```bash -kubectl -n sample-application run strzal --image=djbingham/curl \ - --restart='OnFailure' -i --tty --rm --command -- curl -X POST -F \ - 'locust_count=6' -F 'hatch_rate=2' http://locust-master:8089/swarm -``` - -**To stop the load generation:** - -```bash -kubectl -n sample-application run strzal --image=djbingham/curl \ - --restart='OnFailure' -i --tty --rm --command -- curl \ - http://locust-master:8089/stop -``` - -**To delete HotROD sample app:** - -```bash -curl -sL https://github.com/SigNoz/signoz/raw/main/sample-apps/hotrod/hotrod-delete.sh \ - | HOTROD_NAMESPACE=sample-application bash -``` - ---- - -## General Instructions +# 1. General Instructions **Before making any significant changes, please open an issue**. Each issue should describe the following: @@ -188,3 +54,150 @@ e.g. If you are submitting a fix for an issue in frontend - PR name should be pr 2. Follow [GitHub Flow](https://guides.github.com/introduction/flow/) guidelines for your contribution flows 3. Feel free to ping us on `#contributing` or `#contributing-frontend` on our slack community if you need any help on this :) + + +# 2. How to Contribute + +There are primarily 2 areas in which you can contribute in SigNoz + +- Frontend ( written in Typescript, React) +- Backend - ( Query Service - written in Go) + +Depending upon your area of expertise & interest, you can chose one or more to contribute. Below are detailed instructions to contribute in each area + +> Please note: If you want to work on an issue, please ask the maintainers to assign the issue to you before starting work on it. 
This would help us understand who is working on an issue and prevent duplicate work. πŸ™πŸ» + +> If you just raise a PR, without the corresponding issue being assigned to you - it may not be accepted. + +# 3. Develop Frontend + +Need to update [https://github.com/SigNoz/signoz/tree/main/frontend](https://github.com/SigNoz/signoz/tree/main/frontend) + +### 3.1 Contribute to Frontend with Docker installation of SigNoz + +- `git clone https://github.com/SigNoz/signoz.git && cd signoz` +- comment out frontend service section at `deploy/docker/clickhouse-setup/docker-compose.yaml#L62` +- run `cd deploy` to move to deploy directory +- Install signoz locally without the frontend + - Add below configuration to query-service section at `docker/clickhouse-setup/docker-compose.yaml#L38` + + ```docker + ports: + - "8080:8080" + ``` + - If you are using x86_64 processors (All Intel/AMD processors) run `sudo docker-compose -f docker/clickhouse-setup/docker-compose.yaml up -d` + - If you are on arm64 processors (Apple M1 Macbooks) run `sudo docker-compose -f docker/clickhouse-setup/docker-compose.arm.yaml up -d` +- `cd ../frontend` and change baseURL to `http://localhost:8080` in file `src/constants/env.ts` +- `yarn install` +- `yarn dev` + +> Notes for Maintainers/Contributors who will change Line Numbers of Frontend & Query-Section. Please Update Line Numbers in `./scripts/commentLinesForSetup.sh` + +### 3.2 Contribute to Frontend without installing SigNoz backend + +If you don't want to install SigNoz backend just for doing frontend development, we can provide you with test environments which you can use as the backend. 
Please ping us in #contributing channel in our [slack community](https://signoz.io/slack) and we will DM you with `` + +- `git clone https://github.com/SigNoz/signoz.git && cd signoz/frontend` +- Create a file `.env` with `FRONTEND_API_ENDPOINT=` +- `yarn install` +- `yarn dev` + +**_Frontend should now be accessible at `http://localhost:3301/application`_** + +# 4. Contribute to Query-Service + +Need to update [https://github.com/SigNoz/signoz/tree/main/pkg/query-service](https://github.com/SigNoz/signoz/tree/main/pkg/query-service) + +### 4.1 To run ClickHouse setup (recommended for local development) + +- git clone https://github.com/SigNoz/signoz.git +- run `cd signoz` to move to signoz directory +- run `sudo make dev-setup` to configure local setup to run query-service +- comment out frontend service section at `docker/clickhouse-setup/docker-compose.yaml` +- comment out query-service section at `docker/clickhouse-setup/docker-compose.yaml` +- add below configuration to clickhouse section at `docker/clickhouse-setup/docker-compose.yaml` +```docker + expose: + - 9000 + ports: + - 9001:9000 +``` + +- run `cd pkg/query-service/` to move to query-service directory +- Open ./constants/constants.go + - Replace ```const RELATIONAL_DATASOURCE_PATH = "/var/lib/signoz/signoz.db"``` \ + with ```const RELATIONAL_DATASOURCE_PATH = "./signoz.db".``` + +- Install signoz locally without the frontend and query-service + - If you are using x86_64 processors (All Intel/AMD processors) run `sudo make run-x86` + - If you are on arm64 processors (Apple M1 Macbooks) run `sudo make run-arm` + +#### 4.2 Run locally +```console +ClickHouseUrl=tcp://localhost:9001 STORAGE=clickhouse go run main.go +``` + +> Notes for Maintainers/Contributors who will change Line Numbers of Frontend & Query-Section. 
Please Update Line Numbers in `./scripts/commentLinesForSetup.sh` + +**_Query Service should now be available at `http://localhost:8080`_** + +> If you want to see how, frontend plays with query service, you can run frontend also in you local env with the baseURL changed to `http://localhost:8080` in file `src/constants/env.ts` as the query-service is now running at port `8080` + +--- + + +# 5. Contribute to SigNoz Helm Chart + +Need to update [https://github.com/SigNoz/charts](https://github.com/SigNoz/charts). + +### 5.1 To run helm chart for local development + +- run `git clone https://github.com/SigNoz/charts.git` followed by `cd charts` +- it is recommended to use lightweight kubernetes (k8s) cluster for local development: + - [kind](https://kind.sigs.k8s.io/docs/user/quick-start/#installation) + - [k3d](https://k3d.io/#installation) + - [minikube](https://minikube.sigs.k8s.io/docs/start/) +- create a k8s cluster and make sure `kubectl` points to the locally created k8s cluster +- run `make dev-install` to install SigNoz chart with `my-release` release name in `platform` namespace. 
+- run `kubectl -n platform port-forward svc/my-release-signoz-frontend 3301:3301` to make SigNoz UI available at [localhost:3301](http://localhost:3301) + +**To install HotROD sample app:** + +```bash +curl -sL https://github.com/SigNoz/signoz/raw/main/sample-apps/hotrod/hotrod-install.sh \ + | HELM_RELEASE=my-release SIGNOZ_NAMESPACE=platform bash +``` + +**To load data with HotROD sample app:** + +```bash +kubectl -n sample-application run strzal --image=djbingham/curl \ + --restart='OnFailure' -i --tty --rm --command -- curl -X POST -F \ + 'locust_count=6' -F 'hatch_rate=2' http://locust-master:8089/swarm +``` + +**To stop the load generation:** + +```bash +kubectl -n sample-application run strzal --image=djbingham/curl \ + --restart='OnFailure' -i --tty --rm --command -- curl \ + http://locust-master:8089/stop +``` + +**To delete HotROD sample app:** + +```bash +curl -sL https://github.com/SigNoz/signoz/raw/main/sample-apps/hotrod/hotrod-delete.sh \ + | HOTROD_NAMESPACE=sample-application bash +``` + +--- + +Again, feel free to ping us on `#contributing` or `#contributing-frontend` on our slack community if you need any help on this :) + From 8477aebc8e16aa8885ff0188808c5d74446c5bfb Mon Sep 17 00:00:00 2001 From: Priyansh Khodiyar Date: Wed, 13 Jul 2022 12:27:28 +0530 Subject: [PATCH 02/43] V2 --- CONTRIBUTING.md | 20 +++++++++++--------- 1 file changed, 11 insertions(+), 9 deletions(-) diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index ca6eef6f25..b53a00b588 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -1,11 +1,13 @@ ## Welcome to SigNoz contributing section. Thank you for taking out the time to contribute to this project. -Sections: -1. [General Guidelines](#1-general-instructions) -2. [How to Contribute](#2-how-to-contribute) -3. [Develop Frontend](#3-develop-frontend) -4. [Contribute to Query-Service](#4-contribute-to-query-service) -5. 
[Contribute to SigNoz Helm Chart](#5-contribute-to-signoz-helm-chart) +## Sections: +- [General Guidelines](#1-general-instructions) +- [How to Contribute](#2-how-to-contribute) +- [Develop Frontend](#3-develop-frontend) + - [Contribute to Frontend with Docker installation of SigNoz](#31-contribute-to-frontend-with-docker-installation-of-signoz) + - [Contribute to Frontend without installing SigNoz backend](#32-contribute-to-frontend-without-installing-signoz-backend) +- [Contribute to Query-Service](#4-contribute-to-query-service) +- [Contribute to SigNoz Helm Chart](#5-contribute-to-signoz-helm-chart) # 1. General Instructions @@ -36,7 +38,7 @@ and open a pull request(s). Unless your change is small, Please consider submitt stability and quality of the component. -You can always reach out to `ankit@signoz.io` to understand more about the repo and product. We are very responsive over email and [slack](https://signoz.io/slack). +You can always reach out to `ankit@signoz.io` to understand more about the repo and product. We are very responsive over email and [SLACK](https://signoz.io/slack). - If you find any bugs, please create an issue - If you find anything missing in documentation, you can create an issue with label **documentation** @@ -132,14 +134,14 @@ Need to update [https://github.com/SigNoz/signoz/tree/main/pkg/query-service](ht - If you are using x86_64 processors (All Intel/AMD processors) run `sudo make run-x86` - If you are on arm64 processors (Apple M1 Macbooks) run `sudo make run-arm` -#### 4.2 Run locally +#### Run locally ```console ClickHouseUrl=tcp://localhost:9001 STORAGE=clickhouse go run main.go ``` > Notes for Maintainers/Contributors who will change Line Numbers of Frontend & Query-Section. 
Please Update Line Numbers in `./scripts/commentLinesForSetup.sh` -**_Query Service should now be available at `http://localhost:8080`_** +**_Query Service should now be available at [http://localhost:8080](http://localhost:8080)_** > If you want to see how, frontend plays with query service, you can run frontend also in you local env with the baseURL changed to `http://localhost:8080` in file `src/constants/env.ts` as the query-service is now running at port `8080` From 54e09e1292a7e9a073a54b70a97315ed99ee7c6a Mon Sep 17 00:00:00 2001 From: Priyansh Khodiyar Date: Wed, 13 Jul 2022 13:48:18 +0530 Subject: [PATCH 03/43] v3 --- CONTRIBUTING.md | 137 +++++++++++++++++++++++++++++------------------- 1 file changed, 84 insertions(+), 53 deletions(-) diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index b53a00b588..8a3daaabcc 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -1,4 +1,10 @@ -## Welcome to SigNoz contributing section. Thank you for taking out the time to contribute to this project. +# Contributing Guidelines + +#### Welcome to SigNoz Contributing section πŸŽ‰ + +Thank you for your interest in contributing to our project. Whether it's a bug report, new feature, correction, or additional documentation, we greatly value feedback and contributions from our community. + +Please read through this document before submitting any issues or pull requests to ensure we have all the necessary information to effectively respond to your bug report or contribution. ## Sections: - [General Guidelines](#1-general-instructions) @@ -9,69 +15,87 @@ - [Contribute to Query-Service](#4-contribute-to-query-service) - [Contribute to SigNoz Helm Chart](#5-contribute-to-signoz-helm-chart) -# 1. General Instructions +# 1. General Instructions πŸ“ -**Before making any significant changes, please open an issue**. 
Each issue -should describe the following: +Before making any significant changes and before filing an issue, please check [existing open](https://github.com/SigNoz/signoz/issues?q=is%3Aopen+is%3Aissue), or [recently closed](https://github.com/SigNoz/signoz/issues?q=is%3Aissue+is%3Aclosed), issues to make sure somebody else hasn't already reported the issue. Please try to include as much information as you can. -* Requirement - what kind of use case are you trying to solve? -* Proposal - what do you suggest to solve the problem or improve the existing +#### Details like these are incredibly useful: + +- **Requirement** - what kind of use case are you trying to solve? +- **Proposal** - what do you suggest to solve the problem or improve the existing situation? -* Any open questions to address +- Any open questions to address❓ + +#### If you are reporting a bug, details like these are incredibly useful: + +- A reproducible test case or series of steps. +- The version of our code being used. +- Any modifications you've made relevant to the bug🐞. +- Anything unusual about your environment or deployment. + +
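A quick sketch for gathering the environment details requested above when filing a bug report; everything except `uname` may be missing on a given machine, so absent tools are reported rather than failing:

```shell
# Collect environment details to paste into a bug report.
# Only standard CLIs are used; adjust to whatever your deployment runs.
uname -srm
docker --version 2>/dev/null || echo "docker: not installed"
git --version 2>/dev/null || echo "git: not installed"
```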
Discussing your proposed changes ahead of time will make the contribution
process smooth for everyone.

Once the approach is agreed upon ✅, make your changes
and open a pull request (or several).

**Note:** Unless your change is small, **please** consider submitting separate pull requests:

* 1️⃣ The first PR should include the overall structure of the new component:
  * Readme, configuration, interfaces or base classes, etc...
  * This PR is usually trivial to review, so the size limit does not
    apply to it.
* 2️⃣ The second PR should include the concrete implementation of the component. If the
  size of this PR is larger than the recommended size, consider splitting it into
  multiple PRs.
* If there are multiple sub-components, then ideally each one should be implemented as
  a separate pull request.
* The last PR should include changes to any user-facing documentation. And should include
  end-to-end tests if applicable. The component must be enabled
  only after sufficient testing, and there is enough confidence in the
  stability and quality of the component.


You can always reach out to `ankit@signoz.io` to understand more about the repo and product. We are very responsive over email and [Slack](https://signoz.io/slack).
- If you find any **bugs** → please create an **issue.**
- If you find anything **missing** in documentation → you can create an issue with the label **`documentation`**.
- If you want to build any **new feature** → please create an issue with the label **`enhancement`**.
- If you want to **discuss** something about the product, start a new [**discussion**.](https://github.com/SigNoz/signoz/discussions)

-1. We try to follow https://www.conventionalcommits.org/en/v1.0.0/ +### Conventions to follow when submitting Commits and Pull Request(s). -More specifically the commits and PRs should have type specifiers prefixed in the name. [This](https://www.conventionalcommits.org/en/v1.0.0/#specification) should give you a better idea. +- We try to follow [Conventional Commits.](https://www.conventionalcommits.org/en/v1.0.0/) +, more specifically the commits and PRs should have type specifiers prefixed in the name. [This](https://www.conventionalcommits.org/en/v1.0.0/#specification) should give you a better idea. -e.g. If you are submitting a fix for an issue in frontend - PR name should be prefixed with `fix(FE):` +e.g. If you are submitting a fix for an issue in frontend, the PR name should be prefixed with **`fix(FE):`** -2. Follow [GitHub Flow](https://guides.github.com/introduction/flow/) guidelines for your contribution flows +- Follow [GitHub Flow](https://guides.github.com/introduction/flow/) guidelines for your contribution flows. -3. Feel free to ping us on `#contributing` or `#contributing-frontend` on our slack community if you need any help on this :) +- Feel free to ping us on [#contributing](https://signoz-community.slack.com/archives/C01LWQ8KS7M) or [#contributing-frontend](https://signoz-community.slack.com/archives/C027134DM8B) on our slack community if you need any help on this :) +
# 2. How to Contribute 🙋🏻‍♂️

#### There are primarily 2 areas in which you can contribute to SigNoz

- **Frontend** (written in TypeScript, React)
- **Backend** (Query Service, written in Go)

Depending upon your area of expertise & interest, you can choose one or more to contribute. Below are detailed instructions to contribute in each area.

**Please note:** If you want to work on an issue, please ask the maintainers to assign the issue to you before starting work on it. This would help us understand who is working on an issue and prevent duplicate work. 🙏🏻

⚠️ If you just raise a PR, without the corresponding issue being assigned to you - it may not be accepted.

+ +# 3. Develop Frontend 🌝 Need to update [https://github.com/SigNoz/signoz/tree/main/frontend](https://github.com/SigNoz/signoz/tree/main/frontend) @@ -97,7 +121,7 @@ Need to update [https://github.com/SigNoz/signoz/tree/main/frontend](https://git ### 3.2 Contribute to Frontend without installing SigNoz backend -If you don't want to install SigNoz backend just for doing frontend development, we can provide you with test environments which you can use as the backend. Please ping us in #contributing channel in our [slack community](https://signoz.io/slack) and we will DM you with `` +If you don't want to install the SigNoz backend just for doing frontend development, we can provide you with test environments that you can use as the backend. Please ping us in the #contributing channel in our [slack community](https://signoz.io/slack) and we will DM you with `` - `git clone https://github.com/SigNoz/signoz.git && cd signoz/frontend` - Create a file `.env` with `FRONTEND_API_ENDPOINT=` @@ -106,18 +130,23 @@ If you don't want to install SigNoz backend just for doing frontend development, **_Frontend should now be accessible at `http://localhost:3301/application`_** -# 4. Contribute to Query-Service +
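The frontend-only setup above can be sketched as follows. This demo writes the `.env` into a scratch directory; in the real flow you would run it inside `signoz/frontend`, and `<test-endpoint>` is a placeholder for the URL the SigNoz team DMs you:

```shell
# Point the frontend at a test backend via .env (scratch-dir demo).
mkdir -p /tmp/signoz-frontend
printf 'FRONTEND_API_ENDPOINT=%s\n' '<test-endpoint>' > /tmp/signoz-frontend/.env
cat /tmp/signoz-frontend/.env
# then: yarn install && yarn dev  -> UI at http://localhost:3301
```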
-Need to update [https://github.com/SigNoz/signoz/tree/main/pkg/query-service](https://github.com/SigNoz/signoz/tree/main/pkg/query-service) +# 4. Contribute to Backend (Query-Service) πŸŒ• + +Need to update: [**https://github.com/SigNoz/signoz/tree/main/pkg/query-service**](https://github.com/SigNoz/signoz/tree/main/pkg/query-service) ### 4.1 To run ClickHouse setup (recommended for local development) -- git clone https://github.com/SigNoz/signoz.git -- run `cd signoz` to move to signoz directory -- run `sudo make dev-setup` to configure local setup to run query-service -- comment out frontend service section at `docker/clickhouse-setup/docker-compose.yaml` -- comment out query-service section at `docker/clickhouse-setup/docker-compose.yaml` -- add below configuration to clickhouse section at `docker/clickhouse-setup/docker-compose.yaml` +- Clone SigNoz, +``` +git clone https://github.com/SigNoz/signoz.git +``` +- run `cd signoz` to move to signoz directory, +- run `sudo make dev-setup` to configure local setup to run query-service, +- comment out frontend service section at [`docker/clickhouse-setup/docker-compose.yaml`,](https://github.com/SigNoz/signoz/blob/develop/deploy/docker/clickhouse-setup/docker-compose.yaml) +- comment out query-service section at [`docker/clickhouse-setup/docker-compose.yaml`,](https://github.com/SigNoz/signoz/blob/develop/deploy/docker/clickhouse-setup/docker-compose.yaml) +- add below configuration to clickhouse section at [`docker/clickhouse-setup/docker-compose.yaml`,](https://github.com/SigNoz/signoz/blob/develop/deploy/docker/clickhouse-setup/docker-compose.yaml) ```docker expose: - 9000 @@ -125,14 +154,14 @@ Need to update [https://github.com/SigNoz/signoz/tree/main/pkg/query-service](ht - 9001:9000 ``` -- run `cd pkg/query-service/` to move to query-service directory -- Open ./constants/constants.go +- run `cd pkg/query-service/` to move to `query-service` directory, +- Open 
[`./constants/constants.go`,](https://github.com/SigNoz/signoz/blob/develop/pkg/query-service/constants/constants.go) - Replace ```const RELATIONAL_DATASOURCE_PATH = "/var/lib/signoz/signoz.db"``` \ - with ```const RELATIONAL_DATASOURCE_PATH = "./signoz.db".``` + with β†’ ```const RELATIONAL_DATASOURCE_PATH = "./signoz.db".``` -- Install signoz locally without the frontend and query-service - - If you are using x86_64 processors (All Intel/AMD processors) run `sudo make run-x86` - - If you are on arm64 processors (Apple M1 Macbooks) run `sudo make run-arm` +- Now, install SigNoz locally **without** the `frontend` and `query-service`, + - If you are using `x86_64` processors (All Intel/AMD processors) run `sudo make run-x86` + - If you are on `arm64` processors (Apple M1 Macs) run `sudo make run-arm` #### Run locally ```console @@ -143,7 +172,7 @@ ClickHouseUrl=tcp://localhost:9001 STORAGE=clickhouse go run main.go **_Query Service should now be available at [http://localhost:8080](http://localhost:8080)_** -> If you want to see how, frontend plays with query service, you can run frontend also in you local env with the baseURL changed to `http://localhost:8080` in file `src/constants/env.ts` as the query-service is now running at port `8080` +> If you want to see how the frontend plays with query service, you can run the frontend also in your local env with the baseURL changed to `http://localhost:8080` in file `src/constants/env.ts` as the query-service is now running at port `8080` --- +> To use it on your forked repo, edit the 'Open in Gitpod' button URL to `https://gitpod.io/#https://github.com//signoz` --> + +
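The `RELATIONAL_DATASOURCE_PATH` edit described above can also be applied with `sed` instead of a manual edit. The sketch below runs against a scratch copy of the line; in the real flow the target file is `pkg/query-service/constants/constants.go`:

```shell
# Rewrite the datasource path constant in place (a .bak backup is kept).
mkdir -p /tmp/qs-demo/constants
echo 'const RELATIONAL_DATASOURCE_PATH = "/var/lib/signoz/signoz.db"' > /tmp/qs-demo/constants/constants.go
sed -i.bak 's|"/var/lib/signoz/signoz.db"|"./signoz.db"|' /tmp/qs-demo/constants/constants.go
cat /tmp/qs-demo/constants/constants.go
```

The `-i.bak` form works with both GNU and BSD `sed`, which matters since the docs cover Linux and Apple M1 machines.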
# 5. Contribute to SigNoz Helm Chart @@ -169,14 +200,14 @@ Need to update [https://github.com/SigNoz/charts](https://github.com/SigNoz/char - run `make dev-install` to install SigNoz chart with `my-release` release name in `platform` namespace. - run `kubectl -n platform port-forward svc/my-release-signoz-frontend 3301:3301` to make SigNoz UI available at [localhost:3301](http://localhost:3301) -**To install HotROD sample app:** +**To install the HotROD sample app:** ```bash curl -sL https://github.com/SigNoz/signoz/raw/main/sample-apps/hotrod/hotrod-install.sh \ | HELM_RELEASE=my-release SIGNOZ_NAMESPACE=platform bash ``` -**To load data with HotROD sample app:** +**To load data with the HotROD sample app:** ```bash kubectl -n sample-application run strzal --image=djbingham/curl \ @@ -192,7 +223,7 @@ kubectl -n sample-application run strzal --image=djbingham/curl \ http://locust-master:8089/stop ``` -**To delete HotROD sample app:** +**To delete the HotROD sample app:** ```bash curl -sL https://github.com/SigNoz/signoz/raw/main/sample-apps/hotrod/hotrod-delete.sh \ From a09a4c264e49694c331c08c3dd15cc2bf6dc9106 Mon Sep 17 00:00:00 2001 From: Pranshu Chittora Date: Wed, 13 Jul 2022 15:44:28 +0530 Subject: [PATCH 04/43] feat: change interval of PromQL queries (#1385) --- frontend/src/container/MetricsApplication/Tabs/DBCall.tsx | 2 +- frontend/src/container/MetricsApplication/Tabs/External.tsx | 2 +- frontend/src/container/MetricsApplication/Tabs/Overview.tsx | 4 ++-- 3 files changed, 4 insertions(+), 4 deletions(-) diff --git a/frontend/src/container/MetricsApplication/Tabs/DBCall.tsx b/frontend/src/container/MetricsApplication/Tabs/DBCall.tsx index 60441b7876..2c14c099f7 100644 --- a/frontend/src/container/MetricsApplication/Tabs/DBCall.tsx +++ b/frontend/src/container/MetricsApplication/Tabs/DBCall.tsx @@ -25,7 +25,7 @@ function DBCall({ getWidget }: DBCallProps): JSX.Element { fullViewOptions={false} widget={getWidget([ { - query: 
`sum(rate(signoz_db_latency_count{service_name="${servicename}"${resourceAttributePromQLQuery}}[1m])) by (db_system)`, + query: `sum(rate(signoz_db_latency_count{service_name="${servicename}"${resourceAttributePromQLQuery}}[5m])) by (db_system)`, legend: '{{db_system}}', }, ])} diff --git a/frontend/src/container/MetricsApplication/Tabs/External.tsx b/frontend/src/container/MetricsApplication/Tabs/External.tsx index 9811e2f269..1d7e69831c 100644 --- a/frontend/src/container/MetricsApplication/Tabs/External.tsx +++ b/frontend/src/container/MetricsApplication/Tabs/External.tsx @@ -28,7 +28,7 @@ function External({ getWidget }: ExternalProps): JSX.Element { fullViewOptions={false} widget={getWidget([ { - query: `max((sum(rate(signoz_external_call_latency_count{service_name="${servicename}", status_code="STATUS_CODE_ERROR"${resourceAttributePromQLQuery}}[1m]) OR rate(signoz_external_call_latency_count{service_name="${servicename}", http_status_code=~"5.."${resourceAttributePromQLQuery}}[1m]) OR vector(0)) by (http_url))*100/sum(rate(signoz_external_call_latency_count{service_name="${servicename}"${resourceAttributePromQLQuery}}[1m])) by (http_url)) < 1000 OR vector(0)`, + query: `max((sum(rate(signoz_external_call_latency_count{service_name="${servicename}", status_code="STATUS_CODE_ERROR"${resourceAttributePromQLQuery}}[5m]) OR rate(signoz_external_call_latency_count{service_name="${servicename}", http_status_code=~"5.."${resourceAttributePromQLQuery}}[5m]) OR vector(0)) by (http_url))*100/sum(rate(signoz_external_call_latency_count{service_name="${servicename}"${resourceAttributePromQLQuery}}[5m])) by (http_url)) < 1000 OR vector(0)`, legend: 'External Call Error Percentage', }, ])} diff --git a/frontend/src/container/MetricsApplication/Tabs/Overview.tsx b/frontend/src/container/MetricsApplication/Tabs/Overview.tsx index a53714d05d..803ed91bcc 100644 --- a/frontend/src/container/MetricsApplication/Tabs/Overview.tsx +++ 
b/frontend/src/container/MetricsApplication/Tabs/Overview.tsx @@ -193,7 +193,7 @@ function Application({ getWidget }: DashboardProps): JSX.Element { }} widget={getWidget([ { - query: `sum(rate(signoz_latency_count{service_name="${servicename}", span_kind="SPAN_KIND_SERVER"${resourceAttributePromQLQuery}}[2m]))`, + query: `sum(rate(signoz_latency_count{service_name="${servicename}", span_kind="SPAN_KIND_SERVER"${resourceAttributePromQLQuery}}[5m]))`, legend: 'Requests', }, ])} @@ -227,7 +227,7 @@ function Application({ getWidget }: DashboardProps): JSX.Element { }} widget={getWidget([ { - query: `max(sum(rate(signoz_calls_total{service_name="${servicename}", span_kind="SPAN_KIND_SERVER", status_code="STATUS_CODE_ERROR"${resourceAttributePromQLQuery}}[1m]) OR rate(signoz_calls_total{service_name="${servicename}", span_kind="SPAN_KIND_SERVER", http_status_code=~"5.."${resourceAttributePromQLQuery}}[1m]))*100/sum(rate(signoz_calls_total{service_name="${servicename}", span_kind="SPAN_KIND_SERVER"${resourceAttributePromQLQuery}}[1m]))) < 1000 OR vector(0)`, + query: `max(sum(rate(signoz_calls_total{service_name="${servicename}", span_kind="SPAN_KIND_SERVER", status_code="STATUS_CODE_ERROR"${resourceAttributePromQLQuery}}[5m]) OR rate(signoz_calls_total{service_name="${servicename}", span_kind="SPAN_KIND_SERVER", http_status_code=~"5.."${resourceAttributePromQLQuery}}[5m]))*100/sum(rate(signoz_calls_total{service_name="${servicename}", span_kind="SPAN_KIND_SERVER"${resourceAttributePromQLQuery}}[5m]))) < 1000 OR vector(0)`, legend: 'Error Percentage', }, ])} From a84754e8a81fa009315ffbf4d3914162325ec667 Mon Sep 17 00:00:00 2001 From: Vishal Sharma Date: Wed, 13 Jul 2022 15:55:43 +0530 Subject: [PATCH 05/43] perf: exception page optimization (#1287) * feat: update ListErrors API * feat: update error detail APIs and add a new API for fetching next prev error IDs * feat: update GetNextPrevErrorIDs API to handle an edge case * perf: use timestamp for fetching individual 
column * feat: add countErrors API --- .../app/clickhouseReader/options.go | 2 +- .../app/clickhouseReader/reader.go | 238 +++++++++++++++--- pkg/query-service/app/http_handler.go | 65 +++-- pkg/query-service/app/parser.go | 103 ++++++-- pkg/query-service/constants/constants.go | 4 + pkg/query-service/interfaces/interface.go | 9 +- pkg/query-service/model/queryParams.go | 17 +- pkg/query-service/model/response.go | 22 +- 8 files changed, 378 insertions(+), 82 deletions(-) diff --git a/pkg/query-service/app/clickhouseReader/options.go b/pkg/query-service/app/clickhouseReader/options.go index 30f23b5cf3..99fe5080ae 100644 --- a/pkg/query-service/app/clickhouseReader/options.go +++ b/pkg/query-service/app/clickhouseReader/options.go @@ -22,7 +22,7 @@ const ( defaultTraceDB string = "signoz_traces" defaultOperationsTable string = "signoz_operations" defaultIndexTable string = "signoz_index_v2" - defaultErrorTable string = "signoz_error_index" + defaultErrorTable string = "signoz_error_index_v2" defaulDurationTable string = "durationSortMV" defaultSpansTable string = "signoz_spans" defaultWriteBatchDelay time.Duration = 5 * time.Second diff --git a/pkg/query-service/app/clickhouseReader/reader.go b/pkg/query-service/app/clickhouseReader/reader.go index 596354433e..42182a8e81 100644 --- a/pkg/query-service/app/clickhouseReader/reader.go +++ b/pkg/query-service/app/clickhouseReader/reader.go @@ -4,7 +4,6 @@ import ( "bytes" "context" "crypto/md5" - "database/sql" "encoding/json" "flag" "fmt" @@ -60,7 +59,7 @@ const ( signozTraceDBName = "signoz_traces" signozDurationMVTable = "durationSort" signozSpansTable = "signoz_spans" - signozErrorIndexTable = "signoz_error_index" + signozErrorIndexTable = "signoz_error_index_v2" signozTraceTableName = "signoz_index_v2" signozMetricDBName = "signoz_metrics" signozSampleTableName = "samples_v2" @@ -2634,15 +2633,30 @@ func (r *ClickHouseReader) GetTTL(ctx context.Context, ttlParams *model.GetTTLPa } -func (r *ClickHouseReader) 
GetErrors(ctx context.Context, queryParams *model.GetErrorsParams) (*[]model.Error, *model.ApiError) { +func (r *ClickHouseReader) ListErrors(ctx context.Context, queryParams *model.ListErrorsParams) (*[]model.Error, *model.ApiError) { - var getErrorReponses []model.Error + var getErrorResponses []model.Error - query := fmt.Sprintf("SELECT exceptionType, exceptionMessage, count() AS exceptionCount, min(timestamp) as firstSeen, max(timestamp) as lastSeen, serviceName FROM %s.%s WHERE timestamp >= @timestampL AND timestamp <= @timestampU GROUP BY serviceName, exceptionType, exceptionMessage", r.traceDB, r.errorTable) + query := fmt.Sprintf("SELECT any(exceptionType) as exceptionType, any(exceptionMessage) as exceptionMessage, count() AS exceptionCount, min(timestamp) as firstSeen, max(timestamp) as lastSeen, any(serviceName) as serviceName, groupID FROM %s.%s WHERE timestamp >= @timestampL AND timestamp <= @timestampU GROUP BY groupID", r.traceDB, r.errorTable) args := []interface{}{clickhouse.Named("timestampL", strconv.FormatInt(queryParams.Start.UnixNano(), 10)), clickhouse.Named("timestampU", strconv.FormatInt(queryParams.End.UnixNano(), 10))} + if len(queryParams.OrderParam) != 0 { + if queryParams.Order == constants.Descending { + query = query + " ORDER BY " + queryParams.OrderParam + " DESC" + } else if queryParams.Order == constants.Ascending { + query = query + " ORDER BY " + queryParams.OrderParam + " ASC" + } + } + if queryParams.Limit > 0 { + query = query + " LIMIT @limit" + args = append(args, clickhouse.Named("limit", queryParams.Limit)) + } - err := r.db.Select(ctx, &getErrorReponses, query, args...) + if queryParams.Offset > 0 { + query = query + " OFFSET @offset" + args = append(args, clickhouse.Named("offset", queryParams.Offset)) + } + err := r.db.Select(ctx, &getErrorResponses, query, args...) 
zap.S().Info(query) if err != nil { @@ -2650,30 +2664,41 @@ func (r *ClickHouseReader) GetErrors(ctx context.Context, queryParams *model.Get return nil, &model.ApiError{Typ: model.ErrorExec, Err: fmt.Errorf("Error in processing sql query")} } - return &getErrorReponses, nil - + return &getErrorResponses, nil } -func (r *ClickHouseReader) GetErrorForId(ctx context.Context, queryParams *model.GetErrorParams) (*model.ErrorWithSpan, *model.ApiError) { +func (r *ClickHouseReader) CountErrors(ctx context.Context, queryParams *model.CountErrorsParams) (uint64, *model.ApiError) { + + var errorCount uint64 + + query := fmt.Sprintf("SELECT count(distinct(groupID)) FROM %s.%s WHERE timestamp >= @timestampL AND timestamp <= @timestampU", r.traceDB, r.errorTable) + args := []interface{}{clickhouse.Named("timestampL", strconv.FormatInt(queryParams.Start.UnixNano(), 10)), clickhouse.Named("timestampU", strconv.FormatInt(queryParams.End.UnixNano(), 10))} + + err := r.db.QueryRow(ctx, query, args...).Scan(&errorCount) + zap.S().Info(query) + + if err != nil { + zap.S().Debug("Error in processing sql query: ", err) + return 0, &model.ApiError{Typ: model.ErrorExec, Err: fmt.Errorf("Error in processing sql query")} + } + + return errorCount, nil +} + +func (r *ClickHouseReader) GetErrorFromErrorID(ctx context.Context, queryParams *model.GetErrorParams) (*model.ErrorWithSpan, *model.ApiError) { if queryParams.ErrorID == "" { zap.S().Debug("errorId missing from params") - return nil, &model.ApiError{Typ: model.ErrorExec, Err: fmt.Errorf("ErrorID missing from params")} + return nil, &model.ApiError{Typ: model.ErrorBadData, Err: fmt.Errorf("ErrorID missing from params")} } var getErrorWithSpanReponse []model.ErrorWithSpan - // TODO: Optimize this query further - query := fmt.Sprintf("SELECT spanID, traceID, errorID, timestamp, serviceName, exceptionType, exceptionMessage, exceptionStacktrace, exceptionEscaped, olderErrorId, newerErrorId FROM (SELECT *, lagInFrame(toNullable(errorID)) over 
w as olderErrorId, leadInFrame(toNullable(errorID)) over w as newerErrorId FROM %s.%s window w as (ORDER BY exceptionType, serviceName, timestamp rows between unbounded preceding and unbounded following)) WHERE errorID = @errorID", r.traceDB, r.errorTable) - args := []interface{}{clickhouse.Named("errorID", queryParams.ErrorID)} + query := fmt.Sprintf("SELECT * FROM %s.%s WHERE timestamp = @timestamp AND groupID = @groupID AND errorID = @errorID LIMIT 1", r.traceDB, r.errorTable) + args := []interface{}{clickhouse.Named("errorID", queryParams.ErrorID), clickhouse.Named("groupID", queryParams.GroupID), clickhouse.Named("timestamp", strconv.FormatInt(queryParams.Timestamp.UnixNano(), 10))} err := r.db.Select(ctx, &getErrorWithSpanReponse, query, args...) - zap.S().Info(query) - if err == sql.ErrNoRows { - return nil, nil - } - if err != nil { zap.S().Debug("Error in processing sql query: ", err) return nil, &model.ApiError{Typ: model.ErrorExec, Err: fmt.Errorf("Error in processing sql query")} @@ -2682,22 +2707,17 @@ func (r *ClickHouseReader) GetErrorForId(ctx context.Context, queryParams *model if len(getErrorWithSpanReponse) > 0 { return &getErrorWithSpanReponse[0], nil } else { - return &model.ErrorWithSpan{}, &model.ApiError{Typ: model.ErrorNotFound, Err: fmt.Errorf("Error ID not found")} + return nil, &model.ApiError{Typ: model.ErrorNotFound, Err: fmt.Errorf("Error/Exception not found")} } } -func (r *ClickHouseReader) GetErrorForType(ctx context.Context, queryParams *model.GetErrorParams) (*model.ErrorWithSpan, *model.ApiError) { +func (r *ClickHouseReader) GetErrorFromGroupID(ctx context.Context, queryParams *model.GetErrorParams) (*model.ErrorWithSpan, *model.ApiError) { - if queryParams.ErrorType == "" || queryParams.ServiceName == "" { - zap.S().Debug("errorType/serviceName missing from params") - return nil, &model.ApiError{Typ: model.ErrorExec, Err: fmt.Errorf("ErrorType/serviceName missing from params")} - } var getErrorWithSpanReponse 
[]model.ErrorWithSpan - // TODO: Optimize this query further - query := fmt.Sprintf("SELECT spanID, traceID, errorID, timestamp , serviceName, exceptionType, exceptionMessage, exceptionStacktrace, exceptionEscaped, newerErrorId, olderErrorId FROM (SELECT *, lagInFrame(errorID) over w as olderErrorId, leadInFrame(errorID) over w as newerErrorId FROM %s.%s WHERE serviceName = @serviceName AND exceptionType = @errorType window w as (ORDER BY timestamp DESC rows between unbounded preceding and unbounded following))", r.traceDB, r.errorTable) - args := []interface{}{clickhouse.Named("serviceName", queryParams.ServiceName), clickhouse.Named("errorType", queryParams.ErrorType)} + query := fmt.Sprintf("SELECT * FROM %s.%s WHERE timestamp = @timestamp AND groupID = @groupID LIMIT 1", r.traceDB, r.errorTable) + args := []interface{}{clickhouse.Named("groupID", queryParams.GroupID), clickhouse.Named("timestamp", strconv.FormatInt(queryParams.Timestamp.UnixNano(), 10))} err := r.db.Select(ctx, &getErrorWithSpanReponse, query, args...) 
@@ -2711,11 +2731,173 @@ func (r *ClickHouseReader) GetErrorForType(ctx context.Context, queryParams *mod if len(getErrorWithSpanReponse) > 0 { return &getErrorWithSpanReponse[0], nil } else { - return nil, &model.ApiError{Typ: model.ErrorUnavailable, Err: fmt.Errorf("Error/Exception not found")} + return nil, &model.ApiError{Typ: model.ErrorNotFound, Err: fmt.Errorf("Error/Exception not found")} } } +func (r *ClickHouseReader) GetNextPrevErrorIDs(ctx context.Context, queryParams *model.GetErrorParams) (*model.NextPrevErrorIDs, *model.ApiError) { + + if queryParams.ErrorID == "" { + zap.S().Debug("errorId missing from params") + return nil, &model.ApiError{Typ: model.ErrorBadData, Err: fmt.Errorf("ErrorID missing from params")} + } + var err *model.ApiError + getNextPrevErrorIDsResponse := model.NextPrevErrorIDs{ + GroupID: queryParams.GroupID, + } + getNextPrevErrorIDsResponse.NextErrorID, getNextPrevErrorIDsResponse.NextTimestamp, err = r.getNextErrorID(ctx, queryParams) + if err != nil { + zap.S().Debug("Unable to get next error ID due to err: ", err) + return nil, err + } + getNextPrevErrorIDsResponse.PrevErrorID, getNextPrevErrorIDsResponse.PrevTimestamp, err = r.getPrevErrorID(ctx, queryParams) + if err != nil { + zap.S().Debug("Unable to get prev error ID due to err: ", err) + return nil, err + } + return &getNextPrevErrorIDsResponse, nil + +} + +func (r *ClickHouseReader) getNextErrorID(ctx context.Context, queryParams *model.GetErrorParams) (string, time.Time, *model.ApiError) { + + var getNextErrorIDReponse []model.NextPrevErrorIDsDBResponse + + query := fmt.Sprintf("SELECT errorID as nextErrorID, timestamp as nextTimestamp FROM %s.%s WHERE groupID = @groupID AND timestamp >= @timestamp AND errorID != @errorID ORDER BY timestamp ASC LIMIT 2", r.traceDB, r.errorTable) + args := []interface{}{clickhouse.Named("errorID", queryParams.ErrorID), clickhouse.Named("groupID", queryParams.GroupID), clickhouse.Named("timestamp", 
strconv.FormatInt(queryParams.Timestamp.UnixNano(), 10))} + + err := r.db.Select(ctx, &getNextErrorIDReponse, query, args...) + + zap.S().Info(query) + + if err != nil { + zap.S().Debug("Error in processing sql query: ", err) + return "", time.Time{}, &model.ApiError{Typ: model.ErrorExec, Err: fmt.Errorf("Error in processing sql query")} + } + if len(getNextErrorIDReponse) == 0 { + zap.S().Info("NextErrorID not found") + return "", time.Time{}, nil + } else if len(getNextErrorIDReponse) == 1 { + zap.S().Info("NextErrorID found") + return getNextErrorIDReponse[0].NextErrorID, getNextErrorIDReponse[0].NextTimestamp, nil + } else { + if getNextErrorIDReponse[0].Timestamp.UnixNano() == getNextErrorIDReponse[1].Timestamp.UnixNano() { + var getNextErrorIDReponse []model.NextPrevErrorIDsDBResponse + + query := fmt.Sprintf("SELECT errorID as nextErrorID, timestamp as nextTimestamp FROM %s.%s WHERE groupID = @groupID AND timestamp = @timestamp AND errorID > @errorID ORDER BY errorID ASC LIMIT 1", r.traceDB, r.errorTable) + args := []interface{}{clickhouse.Named("errorID", queryParams.ErrorID), clickhouse.Named("groupID", queryParams.GroupID), clickhouse.Named("timestamp", strconv.FormatInt(queryParams.Timestamp.UnixNano(), 10))} + + err := r.db.Select(ctx, &getNextErrorIDReponse, query, args...) 
+ + zap.S().Info(query) + + if err != nil { + zap.S().Debug("Error in processing sql query: ", err) + return "", time.Time{}, &model.ApiError{Typ: model.ErrorExec, Err: fmt.Errorf("Error in processing sql query")} + } + if len(getNextErrorIDReponse) == 0 { + var getNextErrorIDReponse []model.NextPrevErrorIDsDBResponse + + query := fmt.Sprintf("SELECT errorID as nextErrorID, timestamp as nextTimestamp FROM %s.%s WHERE groupID = @groupID AND timestamp > @timestamp ORDER BY timestamp ASC LIMIT 1", r.traceDB, r.errorTable) + args := []interface{}{clickhouse.Named("errorID", queryParams.ErrorID), clickhouse.Named("groupID", queryParams.GroupID), clickhouse.Named("timestamp", strconv.FormatInt(queryParams.Timestamp.UnixNano(), 10))} + + err := r.db.Select(ctx, &getNextErrorIDReponse, query, args...) + + zap.S().Info(query) + + if err != nil { + zap.S().Debug("Error in processing sql query: ", err) + return "", time.Time{}, &model.ApiError{Typ: model.ErrorExec, Err: fmt.Errorf("Error in processing sql query")} + } + + if len(getNextErrorIDReponse) == 0 { + zap.S().Info("NextErrorID not found") + return "", time.Time{}, nil + } else { + zap.S().Info("NextErrorID found") + return getNextErrorIDReponse[0].NextErrorID, getNextErrorIDReponse[0].NextTimestamp, nil + } + } else { + zap.S().Info("NextErrorID found") + return getNextErrorIDReponse[0].NextErrorID, getNextErrorIDReponse[0].NextTimestamp, nil + } + } else { + zap.S().Info("NextErrorID found") + return getNextErrorIDReponse[0].NextErrorID, getNextErrorIDReponse[0].NextTimestamp, nil + } + } +} + +func (r *ClickHouseReader) getPrevErrorID(ctx context.Context, queryParams *model.GetErrorParams) (string, time.Time, *model.ApiError) { + + var getPrevErrorIDReponse []model.NextPrevErrorIDsDBResponse + + query := fmt.Sprintf("SELECT errorID as prevErrorID, timestamp as prevTimestamp FROM %s.%s WHERE groupID = @groupID AND timestamp <= @timestamp AND errorID != @errorID ORDER BY timestamp DESC LIMIT 2", r.traceDB, 
r.errorTable) + args := []interface{}{clickhouse.Named("errorID", queryParams.ErrorID), clickhouse.Named("groupID", queryParams.GroupID), clickhouse.Named("timestamp", strconv.FormatInt(queryParams.Timestamp.UnixNano(), 10))} + + err := r.db.Select(ctx, &getPrevErrorIDReponse, query, args...) + + zap.S().Info(query) + + if err != nil { + zap.S().Debug("Error in processing sql query: ", err) + return "", time.Time{}, &model.ApiError{Typ: model.ErrorExec, Err: fmt.Errorf("Error in processing sql query")} + } + if len(getPrevErrorIDReponse) == 0 { + zap.S().Info("PrevErrorID not found") + return "", time.Time{}, nil + } else if len(getPrevErrorIDReponse) == 1 { + zap.S().Info("PrevErrorID found") + return getPrevErrorIDReponse[0].PrevErrorID, getPrevErrorIDReponse[0].PrevTimestamp, nil + } else { + if getPrevErrorIDReponse[0].Timestamp.UnixNano() == getPrevErrorIDReponse[1].Timestamp.UnixNano() { + var getPrevErrorIDReponse []model.NextPrevErrorIDsDBResponse + + query := fmt.Sprintf("SELECT errorID as prevErrorID, timestamp as prevTimestamp FROM %s.%s WHERE groupID = @groupID AND timestamp = @timestamp AND errorID < @errorID ORDER BY errorID DESC LIMIT 1", r.traceDB, r.errorTable) + args := []interface{}{clickhouse.Named("errorID", queryParams.ErrorID), clickhouse.Named("groupID", queryParams.GroupID), clickhouse.Named("timestamp", strconv.FormatInt(queryParams.Timestamp.UnixNano(), 10))} + + err := r.db.Select(ctx, &getPrevErrorIDReponse, query, args...) 
+ + zap.S().Info(query) + + if err != nil { + zap.S().Debug("Error in processing sql query: ", err) + return "", time.Time{}, &model.ApiError{Typ: model.ErrorExec, Err: fmt.Errorf("Error in processing sql query")} + } + if len(getPrevErrorIDReponse) == 0 { + var getPrevErrorIDReponse []model.NextPrevErrorIDsDBResponse + + query := fmt.Sprintf("SELECT errorID as prevErrorID, timestamp as prevTimestamp FROM %s.%s WHERE groupID = @groupID AND timestamp < @timestamp ORDER BY timestamp DESC LIMIT 1", r.traceDB, r.errorTable) + args := []interface{}{clickhouse.Named("errorID", queryParams.ErrorID), clickhouse.Named("groupID", queryParams.GroupID), clickhouse.Named("timestamp", strconv.FormatInt(queryParams.Timestamp.UnixNano(), 10))} + + err := r.db.Select(ctx, &getPrevErrorIDReponse, query, args...) + + zap.S().Info(query) + + if err != nil { + zap.S().Debug("Error in processing sql query: ", err) + return "", time.Time{}, &model.ApiError{Typ: model.ErrorExec, Err: fmt.Errorf("Error in processing sql query")} + } + + if len(getPrevErrorIDReponse) == 0 { + zap.S().Info("PrevErrorID not found") + return "", time.Time{}, nil + } else { + zap.S().Info("PrevErrorID found") + return getPrevErrorIDReponse[0].PrevErrorID, getPrevErrorIDReponse[0].PrevTimestamp, nil + } + } else { + zap.S().Info("PrevErrorID found") + return getPrevErrorIDReponse[0].PrevErrorID, getPrevErrorIDReponse[0].PrevTimestamp, nil + } + } else { + zap.S().Info("PrevErrorID found") + return getPrevErrorIDReponse[0].PrevErrorID, getPrevErrorIDReponse[0].PrevTimestamp, nil + } + } +} + func (r *ClickHouseReader) GetMetricAutocompleteTagKey(ctx context.Context, params *model.MetricAutocompleteTagParams) (*[]string, *model.ApiError) { var query string diff --git a/pkg/query-service/app/http_handler.go b/pkg/query-service/app/http_handler.go index 4e923af79c..6f5af546cd 100644 --- a/pkg/query-service/app/http_handler.go +++ b/pkg/query-service/app/http_handler.go @@ -327,11 +327,13 @@ func (aH *APIHandler) 
RegisterRoutes(router *mux.Router) { router.HandleFunc("/api/v1/getTagFilters", ViewAccess(aH.getTagFilters)).Methods(http.MethodPost) router.HandleFunc("/api/v1/getFilteredSpans", ViewAccess(aH.getFilteredSpans)).Methods(http.MethodPost) router.HandleFunc("/api/v1/getFilteredSpans/aggregates", ViewAccess(aH.getFilteredSpanAggregates)).Methods(http.MethodPost) - router.HandleFunc("/api/v1/getTagValues", ViewAccess(aH.getTagValues)).Methods(http.MethodPost) - router.HandleFunc("/api/v1/errors", ViewAccess(aH.getErrors)).Methods(http.MethodGet) - router.HandleFunc("/api/v1/errorWithId", ViewAccess(aH.getErrorForId)).Methods(http.MethodGet) - router.HandleFunc("/api/v1/errorWithType", ViewAccess(aH.getErrorForType)).Methods(http.MethodGet) + + router.HandleFunc("/api/v1/listErrors", ViewAccess(aH.listErrors)).Methods(http.MethodGet) + router.HandleFunc("/api/v1/countErrors", ViewAccess(aH.countErrors)).Methods(http.MethodGet) + router.HandleFunc("/api/v1/errorFromErrorID", ViewAccess(aH.getErrorFromErrorID)).Methods(http.MethodGet) + router.HandleFunc("/api/v1/errorFromGroupID", ViewAccess(aH.getErrorFromGroupID)).Methods(http.MethodGet) + router.HandleFunc("/api/v1/nextPrevErrorIDs", ViewAccess(aH.getNextPrevErrorIDs)).Methods(http.MethodGet) router.HandleFunc("/api/v1/disks", ViewAccess(aH.getDisks)).Methods(http.MethodGet) @@ -1177,49 +1179,78 @@ func (aH *APIHandler) searchTraces(w http.ResponseWriter, r *http.Request) { } -func (aH *APIHandler) getErrors(w http.ResponseWriter, r *http.Request) { +func (aH *APIHandler) listErrors(w http.ResponseWriter, r *http.Request) { - query, err := parseErrorsRequest(r) + query, err := parseListErrorsRequest(r) if aH.handleError(w, err, http.StatusBadRequest) { return } - result, apiErr := (*aH.reader).GetErrors(r.Context(), query) + result, apiErr := (*aH.reader).ListErrors(r.Context(), query) if apiErr != nil && aH.handleError(w, apiErr.Err, http.StatusInternalServerError) { return } aH.writeJSON(w, r, result) - } -func (aH 
*APIHandler) getErrorForId(w http.ResponseWriter, r *http.Request) { +func (aH *APIHandler) countErrors(w http.ResponseWriter, r *http.Request) { - query, err := parseErrorRequest(r) + query, err := parseCountErrorsRequest(r) if aH.handleError(w, err, http.StatusBadRequest) { return } - result, apiErr := (*aH.reader).GetErrorForId(r.Context(), query) - if apiErr != nil && aH.handleError(w, apiErr.Err, http.StatusInternalServerError) { + result, apiErr := (*aH.reader).CountErrors(r.Context(), query) + if apiErr != nil { + respondError(w, apiErr, nil) return } aH.writeJSON(w, r, result) - } -func (aH *APIHandler) getErrorForType(w http.ResponseWriter, r *http.Request) { +func (aH *APIHandler) getErrorFromErrorID(w http.ResponseWriter, r *http.Request) { - query, err := parseErrorRequest(r) + query, err := parseGetErrorRequest(r) if aH.handleError(w, err, http.StatusBadRequest) { return } - result, apiErr := (*aH.reader).GetErrorForType(r.Context(), query) - if apiErr != nil && aH.handleError(w, apiErr.Err, http.StatusInternalServerError) { + result, apiErr := (*aH.reader).GetErrorFromErrorID(r.Context(), query) + if apiErr != nil { + respondError(w, apiErr, nil) return } aH.writeJSON(w, r, result) +} +func (aH *APIHandler) getNextPrevErrorIDs(w http.ResponseWriter, r *http.Request) { + + query, err := parseGetErrorRequest(r) + if aH.handleError(w, err, http.StatusBadRequest) { + return + } + result, apiErr := (*aH.reader).GetNextPrevErrorIDs(r.Context(), query) + if apiErr != nil { + respondError(w, apiErr, nil) + return + } + + aH.writeJSON(w, r, result) +} + +func (aH *APIHandler) getErrorFromGroupID(w http.ResponseWriter, r *http.Request) { + + query, err := parseGetErrorRequest(r) + if aH.handleError(w, err, http.StatusBadRequest) { + return + } + result, apiErr := (*aH.reader).GetErrorFromGroupID(r.Context(), query) + if apiErr != nil { + respondError(w, apiErr, nil) + return + } + + aH.writeJSON(w, r, result) } func (aH *APIHandler) getSpanFilters(w 
http.ResponseWriter, r *http.Request) { diff --git a/pkg/query-service/app/parser.go b/pkg/query-service/app/parser.go index 9d3705da9f..e81b986a3d 100644 --- a/pkg/query-service/app/parser.go +++ b/pkg/query-service/app/parser.go @@ -360,28 +360,6 @@ func parseFilteredSpanAggregatesRequest(r *http.Request) (*model.GetFilteredSpan return postData, nil } -func parseErrorRequest(r *http.Request) (*model.GetErrorParams, error) { - - params := &model.GetErrorParams{} - - serviceName := r.URL.Query().Get("serviceName") - if len(serviceName) != 0 { - params.ServiceName = serviceName - } - - errorType := r.URL.Query().Get("errorType") - if len(errorType) != 0 { - params.ErrorType = errorType - } - - errorId := r.URL.Query().Get("errorId") - if len(errorId) != 0 { - params.ErrorID = errorId - } - - return params, nil -} - func parseTagFilterRequest(r *http.Request) (*model.TagFilterParams, error) { var postData *model.TagFilterParams err := json.NewDecoder(r.Body).Decode(&postData) @@ -427,7 +405,10 @@ func parseTagValueRequest(r *http.Request) (*model.TagFilterParams, error) { } -func parseErrorsRequest(r *http.Request) (*model.GetErrorsParams, error) { +func parseListErrorsRequest(r *http.Request) (*model.ListErrorsParams, error) { + + var allowedOrderParams = []string{"exceptionType", "exceptionCount", "firstSeen", "lastSeen", "serviceName"} + var allowedOrderDirections = []string{"ascending", "descending"} startTime, err := parseTime("start", r) if err != nil { @@ -438,9 +419,79 @@ func parseErrorsRequest(r *http.Request) (*model.GetErrorsParams, error) { return nil, err } - params := &model.GetErrorsParams{ - Start: startTime, - End: endTime, + order := r.URL.Query().Get("order") + if len(order) > 0 && !DoesExistInSlice(order, allowedOrderDirections) { + return nil, errors.New(fmt.Sprintf("given order: %s is not allowed in query", order)) + } + orderParam := r.URL.Query().Get("orderParam") + if len(order) > 0 && !DoesExistInSlice(orderParam, allowedOrderParams) { + 
return nil, errors.New(fmt.Sprintf("given orderParam: %s is not allowed in query", orderParam)) + } + limit := r.URL.Query().Get("limit") + offset := r.URL.Query().Get("offset") + + if len(offset) == 0 || len(limit) == 0 { + return nil, fmt.Errorf("offset or limit param cannot be empty from the query") + } + + limitInt, err := strconv.Atoi(limit) + if err != nil { + return nil, errors.New("limit param is not in correct format") + } + offsetInt, err := strconv.Atoi(offset) + if err != nil { + return nil, errors.New("offset param is not in correct format") + } + + params := &model.ListErrorsParams{ + Start: startTime, + End: endTime, + OrderParam: orderParam, + Order: order, + Limit: int64(limitInt), + Offset: int64(offsetInt), + } + + return params, nil +} + +func parseCountErrorsRequest(r *http.Request) (*model.CountErrorsParams, error) { + + startTime, err := parseTime("start", r) + if err != nil { + return nil, err + } + endTime, err := parseTimeMinusBuffer("end", r) + if err != nil { + return nil, err + } + + params := &model.CountErrorsParams{ + Start: startTime, + End: endTime, + } + + return params, nil +} + +func parseGetErrorRequest(r *http.Request) (*model.GetErrorParams, error) { + + timestamp, err := parseTime("timestamp", r) + if err != nil { + return nil, err + } + + groupID := r.URL.Query().Get("groupID") + + if len(groupID) == 0 { + return nil, fmt.Errorf("groupID param cannot be empty from the query") + } + errorID := r.URL.Query().Get("errorID") + + params := &model.GetErrorParams{ + Timestamp: timestamp, + GroupID: groupID, + ErrorID: errorID, } return params, nil diff --git a/pkg/query-service/constants/constants.go b/pkg/query-service/constants/constants.go index b4bc4b08ef..f74a63c7ff 100644 --- a/pkg/query-service/constants/constants.go +++ b/pkg/query-service/constants/constants.go @@ -61,6 +61,10 @@ const ( StatusPending = "pending" StatusFailed = "failed" StatusSuccess = "success" + ExceptionType = "exceptionType" + ExceptionCount = 
"exceptionCount" + LastSeen = "lastSeen" + FirstSeen = "firstSeen" ) const ( SIGNOZ_METRIC_DBNAME = "signoz_metrics" diff --git a/pkg/query-service/interfaces/interface.go b/pkg/query-service/interfaces/interface.go index 9c52a4497d..705a77c6a3 100644 --- a/pkg/query-service/interfaces/interface.go +++ b/pkg/query-service/interfaces/interface.go @@ -41,9 +41,12 @@ type Reader interface { GetFilteredSpans(ctx context.Context, query *model.GetFilteredSpansParams) (*model.GetFilterSpansResponse, *model.ApiError) GetFilteredSpansAggregates(ctx context.Context, query *model.GetFilteredSpanAggregatesParams) (*model.GetFilteredSpansAggregatesResponse, *model.ApiError) - GetErrors(ctx context.Context, params *model.GetErrorsParams) (*[]model.Error, *model.ApiError) - GetErrorForId(ctx context.Context, params *model.GetErrorParams) (*model.ErrorWithSpan, *model.ApiError) - GetErrorForType(ctx context.Context, params *model.GetErrorParams) (*model.ErrorWithSpan, *model.ApiError) + ListErrors(ctx context.Context, params *model.ListErrorsParams) (*[]model.Error, *model.ApiError) + CountErrors(ctx context.Context, params *model.CountErrorsParams) (uint64, *model.ApiError) + GetErrorFromErrorID(ctx context.Context, params *model.GetErrorParams) (*model.ErrorWithSpan, *model.ApiError) + GetErrorFromGroupID(ctx context.Context, params *model.GetErrorParams) (*model.ErrorWithSpan, *model.ApiError) + GetNextPrevErrorIDs(ctx context.Context, params *model.GetErrorParams) (*model.NextPrevErrorIDs, *model.ApiError) + // Search Interfaces SearchTraces(ctx context.Context, traceID string) (*[]model.SearchSpansResult, error) diff --git a/pkg/query-service/model/queryParams.go b/pkg/query-service/model/queryParams.go index 69509849d2..813b62d17f 100644 --- a/pkg/query-service/model/queryParams.go +++ b/pkg/query-service/model/queryParams.go @@ -282,15 +282,24 @@ type GetTTLParams struct { Type string } -type GetErrorsParams struct { +type ListErrorsParams struct { + Start *time.Time + End 
*time.Time + Limit int64 + OrderParam string + Order string + Offset int64 +} + +type CountErrorsParams struct { Start *time.Time End *time.Time } type GetErrorParams struct { - ErrorType string - ErrorID string - ServiceName string + GroupID string + ErrorID string + Timestamp *time.Time } type FilterItem struct { diff --git a/pkg/query-service/model/response.go b/pkg/query-service/model/response.go index 523ad7e96e..8c9dfad572 100644 --- a/pkg/query-service/model/response.go +++ b/pkg/query-service/model/response.go @@ -341,20 +341,36 @@ type Error struct { LastSeen time.Time `json:"lastSeen" ch:"lastSeen"` FirstSeen time.Time `json:"firstSeen" ch:"firstSeen"` ServiceName string `json:"serviceName" ch:"serviceName"` + GroupID string `json:"groupID" ch:"groupID"` } type ErrorWithSpan struct { ErrorID string `json:"errorId" ch:"errorID"` ExceptionType string `json:"exceptionType" ch:"exceptionType"` ExceptionStacktrace string `json:"exceptionStacktrace" ch:"exceptionStacktrace"` - ExceptionEscaped string `json:"exceptionEscaped" ch:"exceptionEscaped"` + ExceptionEscaped bool `json:"exceptionEscaped" ch:"exceptionEscaped"` ExceptionMsg string `json:"exceptionMessage" ch:"exceptionMessage"` Timestamp time.Time `json:"timestamp" ch:"timestamp"` SpanID string `json:"spanID" ch:"spanID"` TraceID string `json:"traceID" ch:"traceID"` ServiceName string `json:"serviceName" ch:"serviceName"` - NewerErrorID string `json:"newerErrorId" ch:"newerErrorId"` - OlderErrorID string `json:"olderErrorId" ch:"olderErrorId"` + GroupID string `json:"groupID" ch:"groupID"` +} + +type NextPrevErrorIDsDBResponse struct { + NextErrorID string `ch:"nextErrorID"` + NextTimestamp time.Time `ch:"nextTimestamp"` + PrevErrorID string `ch:"prevErrorID"` + PrevTimestamp time.Time `ch:"prevTimestamp"` + Timestamp time.Time `ch:"timestamp"` +} + +type NextPrevErrorIDs struct { + NextErrorID string `json:"nextErrorID"` + NextTimestamp time.Time `json:"nextTimestamp"` + PrevErrorID string 
`json:"prevErrorID"` + PrevTimestamp time.Time `json:"prevTimestamp"` + GroupID string `json:"groupID"` } type Series struct { From 0b08c8003851bbf693b2dd6f05a3bc355214f263 Mon Sep 17 00:00:00 2001 From: Pranshu Chittora Date: Wed, 13 Jul 2022 15:59:22 +0530 Subject: [PATCH 06/43] chore: tests for span to trace tree with missing spans support (#1368) * chore: tests for span to trace tree with missing spans support --- .../container/GantChart/SpanName/index.tsx | 2 +- .../container/GantChart/SpanName/styles.ts | 2 +- .../src/container/GantChart/Trace/styles.ts | 3 +- .../container/TraceDetail/Missingtrace.tsx | 5 +- frontend/src/types/api/trace/getTraceItem.ts | 8 +- .../__snapshots__/spanToTree.test.ts.snap | 211 ++++++++++++++++++ .../src/utils/__tests__/spanToTree.test.ts | 21 ++ frontend/src/utils/fixtures/TraceData.ts | 52 +++++ frontend/src/utils/spanToTree.ts | 6 + 9 files changed, 300 insertions(+), 10 deletions(-) create mode 100644 frontend/src/utils/__tests__/__snapshots__/spanToTree.test.ts.snap create mode 100644 frontend/src/utils/__tests__/spanToTree.test.ts create mode 100644 frontend/src/utils/fixtures/TraceData.ts diff --git a/frontend/src/container/GantChart/SpanName/index.tsx b/frontend/src/container/GantChart/SpanName/index.tsx index 47d58c3e5c..7f536624b9 100644 --- a/frontend/src/container/GantChart/SpanName/index.tsx +++ b/frontend/src/container/GantChart/SpanName/index.tsx @@ -10,7 +10,7 @@ function SpanNameComponent({ {name} - {serviceName} + {serviceName} ); diff --git a/frontend/src/container/GantChart/SpanName/styles.ts b/frontend/src/container/GantChart/SpanName/styles.ts index 642e28f639..abd41dc54e 100644 --- a/frontend/src/container/GantChart/SpanName/styles.ts +++ b/frontend/src/container/GantChart/SpanName/styles.ts @@ -9,7 +9,7 @@ export const Span = styled(Typography.Paragraph)` } `; -export const Service = styled(Typography)` +export const Service = styled(Typography.Paragraph)` &&& { color: #acacac; font-size: 0.75rem; diff 
--git a/frontend/src/container/GantChart/Trace/styles.ts b/frontend/src/container/GantChart/Trace/styles.ts index 7710e77b5b..a85eec454c 100644 --- a/frontend/src/container/GantChart/Trace/styles.ts +++ b/frontend/src/container/GantChart/Trace/styles.ts @@ -41,8 +41,9 @@ export const CardContainer = styled.li<{ isMissing?: boolean }>` width: 100%; cursor: pointer; border-radius: 0.25rem; + z-index: 2; ${({ isMissing }): string => - isMissing ? `border: 1px dashed ${volcano[6]};` : ''} + isMissing ? `border: 1px dashed ${volcano[6]} !important;` : ''} `; interface Props { diff --git a/frontend/src/container/TraceDetail/Missingtrace.tsx b/frontend/src/container/TraceDetail/Missingtrace.tsx index b203f05f68..eb0620a4ed 100644 --- a/frontend/src/container/TraceDetail/Missingtrace.tsx +++ b/frontend/src/container/TraceDetail/Missingtrace.tsx @@ -28,11 +28,10 @@ function MissingSpansMessage(): JSX.Element { justifyContent: 'center', alignItems: 'center', margin: '1rem 0', + fontSize: '0.8rem', }} > - {' '} + {' '} This trace has missing spans diff --git a/frontend/src/types/api/trace/getTraceItem.ts b/frontend/src/types/api/trace/getTraceItem.ts index a653823821..4b12d15b2f 100644 --- a/frontend/src/types/api/trace/getTraceItem.ts +++ b/frontend/src/types/api/trace/getTraceItem.ts @@ -18,10 +18,10 @@ export type Span = [ string, string, string, - string | string[], - string | string[], - string | string[], - Record[], + string[], + string[], + string[], + string[], boolean, ]; diff --git a/frontend/src/utils/__tests__/__snapshots__/spanToTree.test.ts.snap b/frontend/src/utils/__tests__/__snapshots__/spanToTree.test.ts.snap new file mode 100644 index 0000000000..2c2ab402e2 --- /dev/null +++ b/frontend/src/utils/__tests__/__snapshots__/spanToTree.test.ts.snap @@ -0,0 +1,211 @@ +// Jest Snapshot v1, https://goo.gl/fbAQLP + +exports[`utils/spanToTree should return a single tree on valid trace data 1`] = ` +Object { + "missingSpanTree": Array [], + "spanTree": Array [ + 
Object { + "children": Array [ + Object { + "children": Array [ + Object { + "children": Array [], + "event": Array [ + Object { + "attributeMap": Object { + "event": "HTTP request received S3", + "level": "info", + "method": "GET", + "url": "/dispatch?customer=392&nonse=0.015296363321630757", + }, + "timeUnixNano": 1657275433246142000, + }, + ], + "hasError": false, + "id": "span_3", + "isProcessed": true, + "name": "HTTP GET SPAN 3", + "references": Array [ + Object { + "RefType": "CHILD_OF", + "SpanId": "span_2", + "TraceId": "0000000000000000span_1", + }, + ], + "serviceColour": "", + "serviceName": "frontend", + "startTime": 1657275433246, + "tags": Array [ + Object { + "key": "host.name.span3", + "value": "span_3", + }, + ], + "time": 683273000, + "value": 683273000, + }, + ], + "event": Array [ + Object { + "attributeMap": Object { + "event": "HTTP request received S2", + "level": "info", + "method": "GET", + "url": "/dispatch?customer=392&nonse=0.015296363321630757", + }, + "timeUnixNano": 1657275433246142000, + }, + ], + "hasError": false, + "id": "span_2", + "isProcessed": true, + "name": "HTTP GET SPAN 2", + "references": Array [ + Object { + "RefType": "CHILD_OF", + "SpanId": "span_1", + "TraceId": "0000000000000000span_1", + }, + ], + "serviceColour": "", + "serviceName": "frontend", + "startTime": 1657275433246, + "tags": Array [ + Object { + "key": "host.name.span2", + "value": "span_2", + }, + ], + "time": 683273000, + "value": 683273000, + }, + ], + "event": Array [ + Object { + "attributeMap": Object { + "event": "HTTP request received S1", + "level": "info", + "method": "GET", + "url": "/dispatch?customer=392&nonse=0.015296363321630757", + }, + "timeUnixNano": 1657275433246142000, + }, + ], + "hasError": false, + "id": "span_1", + "name": "HTTP GET SPAN 1", + "references": Array [ + Object { + "RefType": "CHILD_OF", + "SpanId": "", + "TraceId": "0000000000000000span_1", + }, + ], + "serviceColour": "", + "serviceName": "frontend", + "startTime": 
1657275433246, + "tags": Array [ + Object { + "key": "host.name.span1", + "value": "span_1", + }, + ], + "time": 683273000, + "value": 683273000, + }, + ], +} +`; + +exports[`utils/spanToTree should return a single tree on valid trace data 2`] = ` +Object { + "missingSpanTree": Array [ + Object { + "children": Array [ + Object { + "children": Array [], + "event": Array [ + Object { + "attributeMap": Object { + "event": "HTTP request received S3", + "level": "info", + "method": "GET", + "url": "/dispatch?customer=392&nonse=0.015296363321630757", + }, + "timeUnixNano": 1657275433246142000, + }, + ], + "hasError": false, + "id": "span_3", + "isProcessed": true, + "name": "HTTP GET SPAN 3", + "references": Array [ + Object { + "RefType": "CHILD_OF", + "SpanId": "span_2", + "TraceId": "0000000000000000span_1", + }, + ], + "serviceColour": "", + "serviceName": "frontend", + "startTime": 1657275433246, + "tags": Array [ + Object { + "key": "host.name.span3", + "value": "span_3", + }, + ], + "time": 683273000, + "value": 683273000, + }, + ], + "id": "span_2", + "isMissing": true, + "name": "Missing Span (span_2)", + "serviceColour": "", + "serviceName": "", + "startTime": null, + "tags": Array [], + "time": null, + "value": null, + }, + ], + "spanTree": Array [ + Object { + "children": Array [], + "event": Array [ + Object { + "attributeMap": Object { + "event": "HTTP request received S1", + "level": "info", + "method": "GET", + "url": "/dispatch?customer=392&nonse=0.015296363321630757", + }, + "timeUnixNano": 1657275433246142000, + }, + ], + "hasError": false, + "id": "span_1", + "name": "HTTP GET SPAN 1", + "references": Array [ + Object { + "RefType": "CHILD_OF", + "SpanId": "", + "TraceId": "0000000000000000span_1", + }, + ], + "serviceColour": "", + "serviceName": "frontend", + "startTime": 1657275433246, + "tags": Array [ + Object { + "key": "host.name.span1", + "value": "span_1", + }, + ], + "time": 683273000, + "value": 683273000, + }, + ], +} +`; diff --git 
a/frontend/src/utils/__tests__/spanToTree.test.ts b/frontend/src/utils/__tests__/spanToTree.test.ts new file mode 100644 index 0000000000..4cf7a20fb4 --- /dev/null +++ b/frontend/src/utils/__tests__/spanToTree.test.ts @@ -0,0 +1,21 @@ +import { TraceData } from '../fixtures/TraceData'; +import { spanToTreeUtil } from '../spanToTree'; + +describe('utils/spanToTree', () => { + test('should return a single tree on valid trace data', () => { + const spanTree = spanToTreeUtil(TraceData); + expect(spanTree.spanTree.length).toBe(1); + expect(spanTree.missingSpanTree.length).toBe(0); + expect(spanTree).toMatchSnapshot(); + }); + test('should return a single tree on valid trace data', () => { + const MissingTraceData = [...TraceData]; + MissingTraceData.splice(1, 1); + + const spanTree = spanToTreeUtil(MissingTraceData); + + expect(spanTree.spanTree.length).toBe(1); + expect(spanTree.missingSpanTree.length).toBe(1); + expect(spanTree).toMatchSnapshot(); + }); +}); diff --git a/frontend/src/utils/fixtures/TraceData.ts b/frontend/src/utils/fixtures/TraceData.ts new file mode 100644 index 0000000000..289e91e949 --- /dev/null +++ b/frontend/src/utils/fixtures/TraceData.ts @@ -0,0 +1,52 @@ +import { Span } from 'types/api/trace/getTraceItem'; + +export const TraceData: Span[] = [ + [ + 1657275433246, + 'span_1', + '0000000000000000span_1', + 'frontend', + 'HTTP GET SPAN 1', + '2', + '683273000', + ['host.name.span1'], + ['span_1'], + ['{TraceId=0000000000000000span_1, SpanId=, RefType=CHILD_OF}'], + [ + '{"timeUnixNano":1657275433246142000,"attributeMap":{"event":"HTTP request received S1","level":"info","method":"GET","url":"/dispatch?customer=392\\u0026nonse=0.015296363321630757"}}', + ], + false, + ], + [ + 1657275433246, + 'span_2', + '0000000000000000span_1', + 'frontend', + 'HTTP GET SPAN 2', + '2', + '683273000', + ['host.name.span2'], + ['span_2'], + ['{TraceId=0000000000000000span_1, SpanId=span_1, RefType=CHILD_OF}'], + [ + 
'{"timeUnixNano":1657275433246142000,"attributeMap":{"event":"HTTP request received S2","level":"info","method":"GET","url":"/dispatch?customer=392\\u0026nonse=0.015296363321630757"}}', + ], + false, + ], + [ + 1657275433246, + 'span_3', + '0000000000000000span_1', + 'frontend', + 'HTTP GET SPAN 3', + '2', + '683273000', + ['host.name.span3'], + ['span_3'], + ['{TraceId=0000000000000000span_1, SpanId=span_2, RefType=CHILD_OF}'], + [ + '{"timeUnixNano":1657275433246142000,"attributeMap":{"event":"HTTP request received S3","level":"info","method":"GET","url":"/dispatch?customer=392\\u0026nonse=0.015296363321630757"}}', + ], + false, + ], +]; diff --git a/frontend/src/utils/spanToTree.ts b/frontend/src/utils/spanToTree.ts index 115c4c111a..142df3dec8 100644 --- a/frontend/src/utils/spanToTree.ts +++ b/frontend/src/utils/spanToTree.ts @@ -109,6 +109,12 @@ export const spanToTreeUtil = (inputSpanList: Span[]): ITraceForest => { const missingSpanTree: ITraceTree[] = []; const referencedTraceIds: string[] = Array.from(traceIdSet); Object.keys(spanMap).forEach((spanId) => { + const isRoot = spanMap[spanId].references?.some((refs) => refs.SpanId === ''); + if (isRoot) { + spanTree.push(spanMap[spanId]); + return; + } + for (const traceId of referencedTraceIds) { if (traceId.includes(spanId)) { spanTree.push(spanMap[spanId]); From 4d1516e3fc91def0314ec1c9924d363ed579946f Mon Sep 17 00:00:00 2001 From: Palash Date: Wed, 13 Jul 2022 16:08:46 +0530 Subject: [PATCH 07/43] chore: removed stale make commands (#1340) Co-authored-by: Prashant Shahi --- Makefile | 6 ------ 1 file changed, 6 deletions(-) diff --git a/Makefile b/Makefile index ac93167fa7..7aaa3a41d6 100644 --- a/Makefile +++ b/Makefile @@ -82,15 +82,9 @@ dev-setup: run-x86: @docker-compose -f $(STANDALONE_DIRECTORY)/docker-compose.yaml up -d -run-arm: - @docker-compose -f $(STANDALONE_DIRECTORY)/docker-compose.arm.yaml up -d - down-x86: @docker-compose -f $(STANDALONE_DIRECTORY)/docker-compose.yaml down -v -down-arm: - 
@docker-compose -f $(STANDALONE_DIRECTORY)/docker-compose.arm.yaml down -v - clear-standalone-data: @docker run --rm -v "$(PWD)/$(STANDALONE_DIRECTORY)/data:/pwd" busybox \ sh -c "cd /pwd && rm -rf alertmanager/* clickhouse/* signoz/*" From 61b79742dc397de0c7c130057181637dbd838f30 Mon Sep 17 00:00:00 2001 From: Priyansh Khodiyar Date: Wed, 13 Jul 2022 17:17:37 +0530 Subject: [PATCH 08/43] v4 --- CONTRIBUTING.md | 192 +++++++++++++++++++++++++++++++----------------- 1 file changed, 124 insertions(+), 68 deletions(-) diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index 8a3daaabcc..3b2bf060d8 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -6,18 +6,30 @@ Thank you for your interest in contributing to our project. Whether it's a bug r Please read through this document before submitting any issues or pull requests to ensure we have all the necessary information to effectively respond to your bug report or contribution. +## Finding contributions to work on πŸ’¬ + +Looking at the existing issues is a great way to find something to contribute on. +Also, have a look at these [good first issues labels](https://github.com/SigNoz/signoz/issues?q=is%3Aissue+is%3Aopen+label%3A%22good+first+issue%22) to start with. 
+ ## Sections: -- [General Guidelines](#1-general-instructions) -- [How to Contribute](#2-how-to-contribute) -- [Develop Frontend](#3-develop-frontend) +- [General Instructions](#1-general-instructions-) + - [For Creating Issue(s)](#11-for-creating-issues) + - [For Pull Requests(s)](#12-for-pull-requests) +- [How to Contribute](#2-how-to-contribute-%EF%B8%8F) +- [Develop Frontend](#3-develop-frontend-) - [Contribute to Frontend with Docker installation of SigNoz](#31-contribute-to-frontend-with-docker-installation-of-signoz) - [Contribute to Frontend without installing SigNoz backend](#32-contribute-to-frontend-without-installing-signoz-backend) -- [Contribute to Query-Service](#4-contribute-to-query-service) -- [Contribute to SigNoz Helm Chart](#5-contribute-to-signoz-helm-chart) +- [Contribute to Backend (Query-Service)](#4-contribute-to-backend-query-service-) + - [To run ClickHouse setup](#41-to-run-clickhouse-setup-recommended-for-local-development) +- [Contribute to SigNoz Helm Chart](#5-contribute-to-signoz-helm-chart-) + - [To run helm chart for local development](#51-to-run-helm-chart-for-local-development) # 1. General Instructions πŸ“ -Before making any significant changes and before filing an issue, please check [existing open](https://github.com/SigNoz/signoz/issues?q=is%3Aopen+is%3Aissue), or [recently closed](https://github.com/SigNoz/signoz/issues?q=is%3Aissue+is%3Aclosed), issues to make sure somebody else hasn't already reported the issue. Please try to include as much information as you can. +## 1.1 For Creating Issue(s) +Before making any significant changes and before filing a new issue, please check [existing open](https://github.com/SigNoz/signoz/issues?q=is%3Aopen+is%3Aissue), or [recently closed](https://github.com/SigNoz/signoz/issues?q=is%3Aissue+is%3Aclosed) issues to make sure somebody else hasn't already reported the issue. Please try to include as much information as you can. 
+ +**Issue Types** - [Bug Report](https://github.com/SigNoz/signoz/issues/new?assignees=&labels=&template=bug_report.md&title=) | [Feature Request](https://github.com/SigNoz/signoz/issues/new?assignees=&labels=&template=feature_request.md&title=) | [Performance Issue Report](https://github.com/SigNoz/signoz/issues/new?assignees=&labels=&template=performance-issue-report.md&title=) | [Report a Security Vulnerability](https://github.com/SigNoz/signoz/security/policy) #### Details like these are incredibly useful: @@ -33,12 +45,16 @@ Before making any significant changes and before filing an issue, please check [ - Any modifications you've made relevant to the bug🐞. - Anything unusual about your environment or deployment. +Discussing your proposed changes ahead of time will make the contribution +process smooth for everyone πŸ™Œ. + + **[`^top^`](#)** +
-Discussing your proposed changes ahead of time will make the contribution -process smooth for everyone. +## 1.2 For Pull Request(s) -Once the approach is agreed uponβœ…, make your changes +Once the approach is agreed upon βœ…, make your changes and open a Pull Request(s). **Note:** Unless your change is small, **please** consider submitting different Pull Rrequest(s): @@ -48,11 +64,11 @@ and open a Pull Request(s). * This PR is usually trivial to review, so the size limit does not apply to it. * 2️⃣ Second PR should include the concrete implementation of the component. If the - size of this PR is larger than the recommended size, consider splitting it into + size of this PR is larger than the recommended size, consider **splitting** βš”οΈ it into multiple PRs. * If there are multiple sub-component then ideally each one should be implemented as - a separate pull request. -* Last PR should include changes to any user-facing documentation. And should include + a **separate** pull request. +* Last PR should include changes to **any user-facing documentation.** And should include end-to-end tests if applicable. The component must be enabled only after sufficient testing, and there is enough confidence in the stability and quality of the component. @@ -60,6 +76,7 @@ and open a Pull Request(s). You can always reach out to `ankit@signoz.io` to understand more about the repo and product. We are very responsive over email and [SLACK](https://signoz.io/slack). +### Pointers: - If you find any **bugs** β†’ please create an **issue.** - If you find anything **missing** in documentation β†’ you can create an issue with the label **`documentation`**. - If you want to build any **new feature** β†’ please create an issue with the label **`enhancement`**. @@ -69,23 +86,24 @@ You can always reach out to `ankit@signoz.io` to understand more about the repo ### Conventions to follow when submitting Commits and Pull Request(s). 
-- We try to follow [Conventional Commits.](https://www.conventionalcommits.org/en/v1.0.0/) -, more specifically the commits and PRs should have type specifiers prefixed in the name. [This](https://www.conventionalcommits.org/en/v1.0.0/#specification) should give you a better idea. +We try to follow [Conventional Commits](https://www.conventionalcommits.org/en/v1.0.0/), more specifically the commits and PRs **should have type specifiers** prefixed in the name. [This](https://www.conventionalcommits.org/en/v1.0.0/#specification) should give you a better idea. e.g. If you are submitting a fix for an issue in frontend, the PR name should be prefixed with **`fix(FE):`** - Follow [GitHub Flow](https://guides.github.com/introduction/flow/) guidelines for your contribution flows. -- Feel free to ping us on [#contributing](https://signoz-community.slack.com/archives/C01LWQ8KS7M) or [#contributing-frontend](https://signoz-community.slack.com/archives/C027134DM8B) on our slack community if you need any help on this :) +- Feel free to ping us on [`#contributing`](https://signoz-community.slack.com/archives/C01LWQ8KS7M) or [`#contributing-frontend`](https://signoz-community.slack.com/archives/C027134DM8B) on our slack community if you need any help on this :) + **[`^top^`](#)** +
# 2. How to Contribute πŸ™‹πŸ»β€β™‚οΈ #### There are primarily 2 areas in which you can contribute to SigNoz -- **Frontend** (Written in Typescript, React) -- **Backend** (Query Service, written in Go) +- [**Frontend**](#3-develop-frontend-) (Written in Typescript, React) +- [**Backend**](#4-contribute-to-backend-query-service-) (Query Service, written in Go) Depending upon your area of expertise & interest, you can choose one or more to contribute. Below are detailed instructions to contribute in each area. @@ -93,66 +111,92 @@ Depending upon your area of expertise & interest, you can choose one or more to ⚠️ If you just raise a PR, without the corresponding issue being assigned to you - it may not be accepted. + **[`^top^`](#)** +
-# 3. Develop Frontend 🌝 +# 3. Develop Frontend 🌚 -Need to update [https://github.com/SigNoz/signoz/tree/main/frontend](https://github.com/SigNoz/signoz/tree/main/frontend) +**Need to Update: [https://github.com/SigNoz/signoz/tree/main/frontend](https://github.com/SigNoz/signoz/tree/main/frontend)** ### 3.1 Contribute to Frontend with Docker installation of SigNoz -- `git clone https://github.com/SigNoz/signoz.git && cd signoz` -- comment out frontend service section at `deploy/docker/clickhouse-setup/docker-compose.yaml#L62` -- run `cd deploy` to move to deploy directory -- Install signoz locally without the frontend - - Add below configuration to query-service section at `docker/clickhouse-setup/docker-compose.yaml#L38` - +- Clone the SigNoz repository and cd into signoz directory, + ``` + git clone https://github.com/SigNoz/signoz.git && cd signoz + ``` +- Comment out `frontend` service section at [`deploy/docker/clickhouse-setup/docker-compose.yaml#L68`](https://github.com/SigNoz/signoz/blob/a09a4c264e49694c331c08c3dd15cc2bf6dc9106/deploy/docker/clickhouse-setup/docker-compose.yaml#L68) +- run `cd deploy` to move to deploy directory, +- Install signoz locally **without** the frontend, + - Add below configuration to query-service section at [`deploy/docker/clickhouse-setup/docker-compose.yaml#L47`](https://github.com/SigNoz/signoz/blob/a09a4c264e49694c331c08c3dd15cc2bf6dc9106/deploy/docker/clickhouse-setup/docker-compose.yaml#L47) ```docker ports: - "8080:8080" ``` - - If you are using x86_64 processors (All Intel/AMD processors) run `sudo docker-compose -f docker/clickhouse-setup/docker-compose.yaml up -d` - - If you are on arm64 processors (Apple M1 Macbooks) run `sudo docker-compose -f docker/clickhouse-setup/docker-compose.arm.yaml up -d` -- `cd ../frontend` and change baseURL to `http://localhost:8080` in file `src/constants/env.ts` -- `yarn install` -- `yarn dev` + + - If you are using `x86_64` processors (All Intel/AMD processors) run + ``` + sudo 
docker-compose -f docker/clickhouse-setup/docker-compose.yaml up -d + ``` + - If you are on `arm64` processors (Apple M1 Macbooks) run + ``` + sudo docker-compose -f docker/clickhouse-setup/docker-compose.arm.yaml up -d + ``` +- `cd ../frontend` and change baseURL to `http://localhost:8080` in file [`src/constants/env.ts`](https://github.com/SigNoz/signoz/blob/develop/frontend/src/constants/env.ts) +- Next, + ``` + yarn install + yarn dev + ``` -> Notes for Maintainers/Contributors who will change Line Numbers of Frontend & Query-Section. Please Update Line Numbers in `./scripts/commentLinesForSetup.sh` +### Important Notes: +The Maintainers / Contributors who will change Line Numbers of `Frontend` & `Query-Section`, please update line numbers in [`/.scripts/commentLinesForSetup.sh`](https://github.com/SigNoz/signoz/blob/develop/.scripts/commentLinesForSetup.sh) + + **[`^top^`](#)** ### 3.2 Contribute to Frontend without installing SigNoz backend -If you don't want to install the SigNoz backend just for doing frontend development, we can provide you with test environments that you can use as the backend. Please ping us in the #contributing channel in our [slack community](https://signoz.io/slack) and we will DM you with `` +If you don't want to install the SigNoz backend just for doing frontend development, we can provide you with test environments that you can use as the backend. 
-- `git clone https://github.com/SigNoz/signoz.git && cd signoz/frontend` -- Create a file `.env` with `FRONTEND_API_ENDPOINT=` -- `yarn install` -- `yarn dev` +- Clone the SigNoz repository and cd into signoz/frontend directory, + ``` + git clone https://github.com/SigNoz/signoz.git && cd signoz/frontend + ```` +- Create a file `.env` in the `frontend` directory with `FRONTEND_API_ENDPOINT=` +- Next, + ``` + yarn install + yarn dev + ``` -**_Frontend should now be accessible at `http://localhost:3301/application`_** +Please ping us in the [`#contributing`](https://signoz-community.slack.com/archives/C01LWQ8KS7M) channel or ask `@Prashant Shahi` in our [Slack Community](https://signoz.io/slack) and we will DM you with ``. +**Frontend should now be accessible at** [`http://localhost:3301/application`](http://localhost:3301/application) + + **[`^top^`](#)** +
-# 4. Contribute to Backend (Query-Service) πŸŒ• +# 4. Contribute to Backend (Query-Service) πŸŒ‘ -Need to update: [**https://github.com/SigNoz/signoz/tree/main/pkg/query-service**](https://github.com/SigNoz/signoz/tree/main/pkg/query-service) +**Need to Update:** [**https://github.com/SigNoz/signoz/tree/main/pkg/query-service**](https://github.com/SigNoz/signoz/tree/main/pkg/query-service) ### 4.1 To run ClickHouse setup (recommended for local development) -- Clone SigNoz, +- Clone the SigNoz repository and cd into signoz directory, ``` -git clone https://github.com/SigNoz/signoz.git +git clone https://github.com/SigNoz/signoz.git && cd signoz ``` -- run `cd signoz` to move to signoz directory, - run `sudo make dev-setup` to configure local setup to run query-service, -- comment out frontend service section at [`docker/clickhouse-setup/docker-compose.yaml`,](https://github.com/SigNoz/signoz/blob/develop/deploy/docker/clickhouse-setup/docker-compose.yaml) -- comment out query-service section at [`docker/clickhouse-setup/docker-compose.yaml`,](https://github.com/SigNoz/signoz/blob/develop/deploy/docker/clickhouse-setup/docker-compose.yaml) -- add below configuration to clickhouse section at [`docker/clickhouse-setup/docker-compose.yaml`,](https://github.com/SigNoz/signoz/blob/develop/deploy/docker/clickhouse-setup/docker-compose.yaml) -```docker - expose: - - 9000 - ports: - - 9001:9000 -``` +- comment out frontend service section at [`deploy/docker/clickhouse-setup/docker-compose.yaml`,](https://github.com/SigNoz/signoz/blob/develop/deploy/docker/clickhouse-setup/docker-compose.yaml) +- comment out query-service section at [`deploy/docker/clickhouse-setup/docker-compose.yaml`,](https://github.com/SigNoz/signoz/blob/develop/deploy/docker/clickhouse-setup/docker-compose.yaml) +- add below configuration to clickhouse section at 
[`deploy/docker/clickhouse-setup/docker-compose.yaml`,](https://github.com/SigNoz/signoz/blob/develop/deploy/docker/clickhouse-setup/docker-compose.yaml) + ``` + expose: + - 9000 + ports: + - 9001:9000 + ``` - run `cd pkg/query-service/` to move to `query-service` directory, - Open [`./constants/constants.go`,](https://github.com/SigNoz/signoz/blob/develop/pkg/query-service/constants/constants.go) @@ -163,18 +207,19 @@ git clone https://github.com/SigNoz/signoz.git - If you are using `x86_64` processors (All Intel/AMD processors) run `sudo make run-x86` - If you are on `arm64` processors (Apple M1 Macs) run `sudo make run-arm` -#### Run locally -```console +#### Run locally, +``` ClickHouseUrl=tcp://localhost:9001 STORAGE=clickhouse go run main.go ``` +### Important Note: +The Maintainers / Contributors who will change Line Numbers of `Frontend` & `Query-Section`, please update line numbers in [`/.scripts/commentLinesForSetup.sh`](https://github.com/SigNoz/signoz/blob/develop/.scripts/commentLinesForSetup.sh) -> Notes for Maintainers/Contributors who will change Line Numbers of Frontend & Query-Section. Please Update Line Numbers in `./scripts/commentLinesForSetup.sh` -**_Query Service should now be available at [http://localhost:8080](http://localhost:8080)_** -> If you want to see how the frontend plays with query service, you can run the frontend also in your local env with the baseURL changed to `http://localhost:8080` in file `src/constants/env.ts` as the query-service is now running at port `8080` +**Query Service should now be available at** [`http://localhost:8080`](http://localhost:8080) + +If you want to see how the frontend plays with query service, you can run the frontend also in your local env with the baseURL changed to `http://localhost:8080` in file [`frontend/src/constants/env.ts`](https://github.com/SigNoz/signoz/blob/develop/frontend/src/constants/env.ts) as the `query-service` is now running at port `8080`. ---- + **[`^top^`](#)** +
-# 5. Contribute to SigNoz Helm Chart +# 5. Contribute to SigNoz Helm Chart πŸ“Š -Need to update [https://github.com/SigNoz/charts](https://github.com/SigNoz/charts). +**Need to Update: [https://github.com/SigNoz/charts](https://github.com/SigNoz/charts).** ### 5.1 To run helm chart for local development -- run `git clone https://github.com/SigNoz/charts.git` followed by `cd charts` -- it is recommended to use lightweight kubernetes (k8s) cluster for local development: +- Clone the SigNoz repository and cd into charts directory, +``` +git clone https://github.com/SigNoz/charts.git && cd charts +``` +- It is recommended to use lightweight kubernetes (k8s) cluster for local development: - [kind](https://kind.sigs.k8s.io/docs/user/quick-start/#installation) - [k3d](https://k3d.io/#installation) - [minikube](https://minikube.sigs.k8s.io/docs/start/) -- create a k8s cluster and make sure `kubectl` points to the locally created k8s cluster -- run `make dev-install` to install SigNoz chart with `my-release` release name in `platform` namespace. 
-- run `kubectl -n platform port-forward svc/my-release-signoz-frontend 3301:3301` to make SigNoz UI available at [localhost:3301](http://localhost:3301) +- create a k8s cluster and make sure `kubectl` points to the locally created k8s cluster, +- run `make dev-install` to install SigNoz chart with `my-release` release name in `platform` namespace, +- next run, +``` +kubectl -n platform port-forward svc/my-release-signoz-frontend 3301:3301 +``` +to make SigNoz UI available at [localhost:3301](http://localhost:3301) -**To install the HotROD sample app:** +**5.1.1 To install the HotROD sample app:** ```bash curl -sL https://github.com/SigNoz/signoz/raw/main/sample-apps/hotrod/hotrod-install.sh \ | HELM_RELEASE=my-release SIGNOZ_NAMESPACE=platform bash ``` -**To load data with the HotROD sample app:** +**5.1.2 To load data with the HotROD sample app:** ```bash kubectl -n sample-application run strzal --image=djbingham/curl \ @@ -215,7 +269,7 @@ kubectl -n sample-application run strzal --image=djbingham/curl \ 'locust_count=6' -F 'hatch_rate=2' http://locust-master:8089/swarm ``` -**To stop the load generation:** +**5.1.3 To stop the load generation:** ```bash kubectl -n sample-application run strzal --image=djbingham/curl \ @@ -223,13 +277,15 @@ kubectl -n sample-application run strzal --image=djbingham/curl \ http://locust-master:8089/stop ``` -**To delete the HotROD sample app:** +**5.1.4 To delete the HotROD sample app:** ```bash curl -sL https://github.com/SigNoz/signoz/raw/main/sample-apps/hotrod/hotrod-delete.sh \ | HOTROD_NAMESPACE=sample-application bash ``` + **[`^top^`](#)** + --- Again, feel free to ping us on `#contributing` or `#contributing-frontend` on our slack community if you need any help on this :) From ef69505bf9a21b7438985161d1c9edf6d4935f4e Mon Sep 17 00:00:00 2001 From: Priyansh Khodiyar Date: Wed, 13 Jul 2022 17:22:31 +0530 Subject: [PATCH 09/43] remove arm version of docker-compose file --- CONTRIBUTING.md | 10 +++------- 1 file changed, 3 
insertions(+), 7 deletions(-) diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index 3b2bf060d8..ae2f595b2f 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -125,23 +125,19 @@ Depending upon your area of expertise & interest, you can choose one or more to ``` git clone https://github.com/SigNoz/signoz.git && cd signoz ``` -- Comment out `frontend` service section at [`deploy/docker/clickhouse-setup/docker-compose.yaml#L68`](https://github.com/SigNoz/signoz/blob/a09a4c264e49694c331c08c3dd15cc2bf6dc9106/deploy/docker/clickhouse-setup/docker-compose.yaml#L68) +- Comment out `frontend` service section at [`deploy/docker/clickhouse-setup/docker-compose.yaml#L68`](https://github.com/SigNoz/signoz/blob/develop/deploy/docker/clickhouse-setup/docker-compose.yaml#L68) - run `cd deploy` to move to deploy directory, - Install signoz locally **without** the frontend, - - Add below configuration to query-service section at [`deploy/docker/clickhouse-setup/docker-compose.yaml#L47`](https://github.com/SigNoz/signoz/blob/a09a4c264e49694c331c08c3dd15cc2bf6dc9106/deploy/docker/clickhouse-setup/docker-compose.yaml#L47) + - Add below configuration to query-service section at [`deploy/docker/clickhouse-setup/docker-compose.yaml#L47`](https://github.com/SigNoz/signoz/blob/develop/deploy/docker/clickhouse-setup/docker-compose.yaml#L47) ```docker ports: - "8080:8080" ``` - - If you are using `x86_64` processors (All Intel/AMD processors) run + - Next run, ``` sudo docker-compose -f docker/clickhouse-setup/docker-compose.yaml up -d ``` - - If you are on `arm64` processors (Apple M1 Macbooks) run - ``` - sudo docker-compose -f docker/clickhouse-setup/docker-compose.arm.yaml up -d - ``` - `cd ../frontend` and change baseURL to `http://localhost:8080` in file [`src/constants/env.ts`](https://github.com/SigNoz/signoz/blob/develop/frontend/src/constants/env.ts) - Next, ``` From ab52538e91e622e4e7239a965d91d0a76b1cb356 Mon Sep 17 00:00:00 2001 From: Priyansh Khodiyar Date: Wed, 13 Jul 2022 
17:43:23 +0530 Subject: [PATCH 10/43] v5 --- CONTRIBUTING.md | 24 ++++++++++++++++++++++-- 1 file changed, 22 insertions(+), 2 deletions(-) diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index ae2f595b2f..085e40742f 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -54,8 +54,23 @@ process smooth for everyone πŸ™Œ. ## 1.2 For Pull Request(s) -Once the approach is agreed upon βœ…, make your changes -and open a Pull Request(s). +Contributions via pull requests are much appreciated. Once the approach is agreed upon βœ…, make your changes and open a Pull Request(s). +Before sending us a pull request, please ensure that: + +- Fork the SigNoz repo on GitHub and clone it on your machine. +- Create a branch with your changes. +- You are working against the latest source on the `develop` branch. +- Modify the source; please focus only on the specific change you are contributing. +- Ensure local tests pass. +- Commit to your fork using clear commit messages. +- Send us a pull request, answering any default questions in the pull request interface. +- Pay attention to any automated CI failures reported in the pull request, and stay involved in the conversation. +- Once you've pushed your commits to GitHub, make sure that your branch can be auto-merged (there are no merge conflicts). If not, on your computer, merge main into your branch, resolve any merge conflicts, make sure everything still runs correctly and passes all the tests, and then push up those changes. +- Once the change has been approved and merged, we will inform you in a comment. + + +GitHub provides additional documentation on [forking a repository](https://help.github.com/articles/fork-a-repo/) and +[creating a pull request](https://help.github.com/articles/creating-a-pull-request/). 
**Note:** Unless your change is small, **please** consider submitting separate Pull Request(s): @@ -284,5 +299,10 @@ curl -sL https://github.com/SigNoz/signoz/raw/main/sample-apps/hotrod/hotrod-del --- +## License + +By contributing to FBShipIt, you agree that your contributions will be licensed under its MIT license. + Again, feel free to ping us on `#contributing` or `#contributing-frontend` on our slack community if you need any help on this :) +Thank You! From 0ceaa56679545b3663cbb74f8367b8fb33ae8dd4 Mon Sep 17 00:00:00 2001 From: Priyansh Khodiyar Date: Wed, 13 Jul 2022 17:46:48 +0530 Subject: [PATCH 11/43] v6 --- CONTRIBUTING.md | 12 ++++++++++++ 1 file changed, 12 insertions(+) diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index 085e40742f..2dab66f09d 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -11,6 +11,7 @@ Please read through this document before submitting any issues or pull requests Looking at the existing issues is a great way to find something to contribute on. Also, have a look at these [good first issues labels](https://github.com/SigNoz/signoz/issues?q=is%3Aissue+is%3Aopen+label%3A%22good+first+issue%22) to start with. + ## Sections: - [General Instructions](#1-general-instructions-) - [For Creating Issue(s)](#11-for-creating-issues) @@ -299,6 +300,17 @@ curl -sL https://github.com/SigNoz/signoz/raw/main/sample-apps/hotrod/hotrod-del --- +## Other ways to contribute + +There are many other ways to get involved with the community and to participate in this project: + +- Use the product, submitting GitHub issues when a problem is found. +- Help code review pull requests and participate in issue threads. +- Submit a new feature request as an issue. +- Help answer questions on forums such as Stack Overflow and [SigNoz Community Slack Channel](https://signoz.io/slack). +- Tell others about the project on Twitter, your blog, etc.
+ + ## License By contributing to FBShipIt, you agree that your contributions will be licensed under its MIT license. From 3dc1dc970fe746476572f7dbe00162807b3f0907 Mon Sep 17 00:00:00 2001 From: Priyansh Khodiyar Date: Wed, 13 Jul 2022 18:00:46 +0530 Subject: [PATCH 12/43] v6 --- CONTRIBUTING.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index 2dab66f09d..9bee883ec3 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -2,7 +2,7 @@ #### Welcome to SigNoz Contributing section πŸŽ‰ -Thank you for your interest in contributing to our project. Whether it's a bug report, new feature, correction, or additional documentation, we greatly value feedback and contributions from our community. +Hi there! We're thrilled that you'd like to contribute to this project, thank you for your interest. Whether it's a bug report, new feature, correction, or additional documentation, we greatly value feedback and contributions from our community. Please read through this document before submitting any issues or pull requests to ensure we have all the necessary information to effectively respond to your bug report or contribution. From 3e2a6df2005a8d7c9ce0e1b12fd2fcfec406e2ef Mon Sep 17 00:00:00 2001 From: Priyansh Khodiyar Date: Wed, 13 Jul 2022 18:08:02 +0530 Subject: [PATCH 13/43] Update CONTRIBUTING.md --- CONTRIBUTING.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index 9bee883ec3..3809a5dda9 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -313,7 +313,7 @@ There are many other ways to get involved with the community and to participate ## License -By contributing to FBShipIt, you agree that your contributions will be licensed under its MIT license. +By contributing to SigNoz, you agree that your contributions will be licensed under its MIT license. 
Again, feel free to ping us on `#contributing` or `#contributing-frontend` on our slack community if you need any help on this :) From 5554cce379347b9c4b6adae4e5c6850a5b6400c1 Mon Sep 17 00:00:00 2001 From: Palash Date: Wed, 13 Jul 2022 19:49:27 +0530 Subject: [PATCH 14/43] feat: exception page is updated (#1376) * chore: all error utils is added * chore: error page list is added with total page and other handlings * test: unit test case for order is added --- frontend/package.json | 1 + frontend/src/api/errors/getAll.ts | 5 +- .../api/errors/getByErrorTypeAndService.ts | 7 +- frontend/src/api/errors/getById.ts | 8 +- frontend/src/api/errors/getErrorCounts.ts | 29 +++ frontend/src/api/errors/getNextPrevId.ts | 29 +++ frontend/src/container/AllError/index.tsx | 165 +++++++++++++++--- frontend/src/container/AllError/utils.test.ts | 28 +++ frontend/src/container/AllError/utils.ts | 89 ++++++++++ frontend/src/container/ErrorDetails/index.tsx | 86 +++++---- frontend/src/lib/createQueryParams.ts | 4 +- frontend/src/pages/ErrorDetails/index.tsx | 103 ++++++----- frontend/src/pages/ErrorDetails/utils.ts | 8 + frontend/src/types/api/errors/getAll.ts | 13 ++ frontend/src/types/api/errors/getByErrorId.ts | 9 + .../api/errors/getByErrorTypeAndService.ts | 13 +- .../errors/{getById.ts => getErrorCounts.ts} | 7 +- .../src/types/api/errors/getNextPrevId.ts | 13 ++ frontend/yarn.lock | 5 + 19 files changed, 481 insertions(+), 141 deletions(-) create mode 100644 frontend/src/api/errors/getErrorCounts.ts create mode 100644 frontend/src/api/errors/getNextPrevId.ts create mode 100644 frontend/src/container/AllError/utils.test.ts create mode 100644 frontend/src/container/AllError/utils.ts create mode 100644 frontend/src/pages/ErrorDetails/utils.ts create mode 100644 frontend/src/types/api/errors/getByErrorId.ts rename frontend/src/types/api/errors/{getById.ts => getErrorCounts.ts} (53%) create mode 100644 frontend/src/types/api/errors/getNextPrevId.ts diff --git a/frontend/package.json 
b/frontend/package.json index f93bc9684c..f3ccdcc8c8 100644 --- a/frontend/package.json +++ b/frontend/package.json @@ -81,6 +81,7 @@ "style-loader": "1.3.0", "styled-components": "^5.2.1", "terser-webpack-plugin": "^5.2.5", + "timestamp-nano": "^1.0.0", "ts-node": "^10.2.1", "tsconfig-paths-webpack-plugin": "^3.5.1", "typescript": "^4.0.5", diff --git a/frontend/src/api/errors/getAll.ts b/frontend/src/api/errors/getAll.ts index dcd8aa8e73..7014e52a56 100644 --- a/frontend/src/api/errors/getAll.ts +++ b/frontend/src/api/errors/getAll.ts @@ -10,9 +10,8 @@ const getAll = async ( ): Promise | ErrorResponse> => { try { const response = await axios.get( - `/errors?${createQueryParams({ - start: props.start.toString(), - end: props.end.toString(), + `/listErrors?${createQueryParams({ + ...props, })}`, ); diff --git a/frontend/src/api/errors/getByErrorTypeAndService.ts b/frontend/src/api/errors/getByErrorTypeAndService.ts index 6a2c6964d9..c9a710fd72 100644 --- a/frontend/src/api/errors/getByErrorTypeAndService.ts +++ b/frontend/src/api/errors/getByErrorTypeAndService.ts @@ -10,11 +10,8 @@ const getByErrorType = async ( ): Promise | ErrorResponse> => { try { const response = await axios.get( - `/errorWithType?${createQueryParams({ - start: props.start.toString(), - end: props.end.toString(), - serviceName: props.serviceName, - errorType: props.errorType, + `/errorFromGroupID?${createQueryParams({ + ...props, })}`, ); diff --git a/frontend/src/api/errors/getById.ts b/frontend/src/api/errors/getById.ts index 3ab7c4aa60..ab0bae3f8a 100644 --- a/frontend/src/api/errors/getById.ts +++ b/frontend/src/api/errors/getById.ts @@ -3,17 +3,15 @@ import { ErrorResponseHandler } from 'api/ErrorResponseHandler'; import { AxiosError } from 'axios'; import createQueryParams from 'lib/createQueryParams'; import { ErrorResponse, SuccessResponse } from 'types/api'; -import { PayloadProps, Props } from 'types/api/errors/getById'; +import { PayloadProps, Props } from 
'types/api/errors/getByErrorId'; const getById = async ( props: Props, ): Promise | ErrorResponse> => { try { const response = await axios.get( - `/errorWithId?${createQueryParams({ - start: props.start.toString(), - end: props.end.toString(), - errorId: props.errorId, + `/errorFromErrorID?${createQueryParams({ + ...props, })}`, ); diff --git a/frontend/src/api/errors/getErrorCounts.ts b/frontend/src/api/errors/getErrorCounts.ts new file mode 100644 index 0000000000..4992a6d391 --- /dev/null +++ b/frontend/src/api/errors/getErrorCounts.ts @@ -0,0 +1,29 @@ +import axios from 'api'; +import { ErrorResponseHandler } from 'api/ErrorResponseHandler'; +import { AxiosError } from 'axios'; +import createQueryParams from 'lib/createQueryParams'; +import { ErrorResponse, SuccessResponse } from 'types/api'; +import { PayloadProps, Props } from 'types/api/errors/getErrorCounts'; + +const getErrorCounts = async ( + props: Props, +): Promise | ErrorResponse> => { + try { + const response = await axios.get( + `/countErrors?${createQueryParams({ + ...props, + })}`, + ); + + return { + statusCode: 200, + error: null, + message: response.data.message, + payload: response.data, + }; + } catch (error) { + return ErrorResponseHandler(error as AxiosError); + } +}; + +export default getErrorCounts; diff --git a/frontend/src/api/errors/getNextPrevId.ts b/frontend/src/api/errors/getNextPrevId.ts new file mode 100644 index 0000000000..07798c548e --- /dev/null +++ b/frontend/src/api/errors/getNextPrevId.ts @@ -0,0 +1,29 @@ +import axios from 'api'; +import { ErrorResponseHandler } from 'api/ErrorResponseHandler'; +import { AxiosError } from 'axios'; +import createQueryParams from 'lib/createQueryParams'; +import { ErrorResponse, SuccessResponse } from 'types/api'; +import { PayloadProps, Props } from 'types/api/errors/getNextPrevId'; + +const getErrorCounts = async ( + props: Props, +): Promise | ErrorResponse> => { + try { + const response = await axios.get( + 
`/nextPrevErrorIDs?${createQueryParams({ + ...props, + })}`, + ); + + return { + statusCode: 200, + error: null, + message: response.data.message, + payload: response.data, + }; + } catch (error) { + return ErrorResponseHandler(error as AxiosError); + } +}; + +export default getErrorCounts; diff --git a/frontend/src/container/AllError/index.tsx b/frontend/src/container/AllError/index.tsx index 51f47c1104..58b9c1201a 100644 --- a/frontend/src/container/AllError/index.tsx +++ b/frontend/src/container/AllError/index.tsx @@ -1,31 +1,85 @@ -import { notification, Table, Tooltip, Typography } from 'antd'; +import { notification, Table, TableProps, Tooltip, Typography } from 'antd'; import { ColumnsType } from 'antd/lib/table'; import getAll from 'api/errors/getAll'; +import getErrorCounts from 'api/errors/getErrorCounts'; import ROUTES from 'constants/routes'; import dayjs from 'dayjs'; -import React, { useEffect } from 'react'; +import createQueryParams from 'lib/createQueryParams'; +import history from 'lib/history'; +import React, { useEffect, useMemo } from 'react'; import { useTranslation } from 'react-i18next'; -import { useQuery } from 'react-query'; +import { useQueries } from 'react-query'; import { useSelector } from 'react-redux'; -import { Link } from 'react-router-dom'; +import { Link, useLocation } from 'react-router-dom'; import { AppState } from 'store/reducers'; -import { Exception } from 'types/api/errors/getAll'; +import { ErrorResponse, SuccessResponse } from 'types/api'; +import { Exception, PayloadProps } from 'types/api/errors/getAll'; import { GlobalReducer } from 'types/reducer/globalTime'; +import { + getDefaultOrder, + getNanoSeconds, + getOffSet, + getOrder, + getOrderParams, + getUpdatePageSize, + urlKey, +} from './utils'; + function AllErrors(): JSX.Element { - const { maxTime, minTime } = useSelector( + const { maxTime, minTime, loading } = useSelector( (state) => state.globalTime, ); + const { search, pathname } = useLocation(); + const 
params = useMemo(() => new URLSearchParams(search), [search]); const { t } = useTranslation(['common']); - const { isLoading, data } = useQuery(['getAllError', [maxTime, minTime]], { - queryFn: () => - getAll({ - end: maxTime, - start: minTime, - }), - }); + const updatedOrder = getOrder(params.get(urlKey.order)); + const getUpdatedOffset = getOffSet(params.get(urlKey.offset)); + const getUpdatedParams = getOrderParams(params.get(urlKey.orderParam)); + const getUpdatedPageSize = getUpdatePageSize(params.get(urlKey.pageSize)); + + const updatedPath = useMemo( + () => + `${pathname}?${createQueryParams({ + order: updatedOrder, + offset: getUpdatedOffset, + orderParam: getUpdatedParams, + pageSize: getUpdatedPageSize, + })}`, + [ + pathname, + updatedOrder, + getUpdatedOffset, + getUpdatedParams, + getUpdatedPageSize, + ], + ); + + const [{ isLoading, data }, errorCountResponse] = useQueries([ + { + queryKey: ['getAllErrors', updatedPath, maxTime, minTime], + queryFn: (): Promise | ErrorResponse> => + getAll({ + end: maxTime, + start: minTime, + order: updatedOrder, + limit: getUpdatedPageSize, + offset: getUpdatedOffset, + orderParam: getUpdatedParams, + }), + enabled: !loading, + }, + { + queryKey: ['getErrorCounts', maxTime, minTime], + queryFn: (): Promise> => + getErrorCounts({ + end: maxTime, + start: minTime, + }), + }, + ]); useEffect(() => { if (data?.error) { @@ -35,11 +89,9 @@ function AllErrors(): JSX.Element { } }, [data?.error, data?.payload, t]); - const getDateValue = (value: string): JSX.Element => { - return ( - {dayjs(value).format('DD/MM/YYYY HH:mm:ss A')} - ); - }; + const getDateValue = (value: string): JSX.Element => ( + {dayjs(value).format('DD/MM/YYYY HH:mm:ss A')} + ); const columns: ColumnsType = [ { @@ -49,14 +101,22 @@ function AllErrors(): JSX.Element { render: (value, record): JSX.Element => ( value}> {value} ), - sorter: (a, b): number => - a.exceptionType.charCodeAt(0) - b.exceptionType.charCodeAt(0), + sorter: true, + 
defaultSortOrder: getDefaultOrder( + getUpdatedParams, + updatedOrder, + 'exceptionType', + ), }, { title: 'Error Message', @@ -78,39 +138,86 @@ function AllErrors(): JSX.Element { title: 'Count', dataIndex: 'exceptionCount', key: 'exceptionCount', - sorter: (a, b): number => a.exceptionCount - b.exceptionCount, + sorter: true, + defaultSortOrder: getDefaultOrder( + getUpdatedParams, + updatedOrder, + 'exceptionCount', + ), }, { title: 'Last Seen', dataIndex: 'lastSeen', key: 'lastSeen', render: getDateValue, - sorter: (a, b): number => - dayjs(b.lastSeen).isBefore(dayjs(a.lastSeen)) === true ? 1 : 0, + sorter: true, + defaultSortOrder: getDefaultOrder( + getUpdatedParams, + updatedOrder, + 'lastSeen', + ), }, { title: 'First Seen', dataIndex: 'firstSeen', key: 'firstSeen', render: getDateValue, - sorter: (a, b): number => - dayjs(b.firstSeen).isBefore(dayjs(a.firstSeen)) === true ? 1 : 0, + sorter: true, + defaultSortOrder: getDefaultOrder( + getUpdatedParams, + updatedOrder, + 'firstSeen', + ), }, { title: 'Application', dataIndex: 'serviceName', key: 'serviceName', - sorter: (a, b): number => - a.serviceName.charCodeAt(0) - b.serviceName.charCodeAt(0), + sorter: true, + defaultSortOrder: getDefaultOrder( + getUpdatedParams, + updatedOrder, + 'serviceName', + ), }, ]; + const onChangeHandler: TableProps['onChange'] = ( + paginations, + _, + sorter, + ) => { + if (!Array.isArray(sorter)) { + const { current = 0, pageSize = 0 } = paginations; + const { columnKey = '', order } = sorter; + const updatedOrder = order === 'ascend' ? 
'ascending' : 'descending'; + + history.replace( + `${pathname}?${createQueryParams({ + order: updatedOrder, + offset: current - 1, + orderParam: columnKey, + pageSize, + })}`, + ); + } + }; + return ( ); } diff --git a/frontend/src/container/AllError/utils.test.ts b/frontend/src/container/AllError/utils.test.ts new file mode 100644 index 0000000000..b0d302f01b --- /dev/null +++ b/frontend/src/container/AllError/utils.test.ts @@ -0,0 +1,28 @@ +import { isOrder, isOrderParams } from './utils'; + +describe('Error utils', () => { + test('Valid OrderBy Params', () => { + expect(isOrderParams('serviceName')).toBe(true); + expect(isOrderParams('exceptionCount')).toBe(true); + expect(isOrderParams('lastSeen')).toBe(true); + expect(isOrderParams('firstSeen')).toBe(true); + expect(isOrderParams('exceptionType')).toBe(true); + }); + + test('Invalid OrderBy Params', () => { + expect(isOrderParams('invalid')).toBe(false); + expect(isOrderParams(null)).toBe(false); + expect(isOrderParams('')).toBe(false); + }); + + test('Valid Order', () => { + expect(isOrder('ascending')).toBe(true); + expect(isOrder('descending')).toBe(true); + }); + + test('Invalid Order', () => { + expect(isOrder('invalid')).toBe(false); + expect(isOrder(null)).toBe(false); + expect(isOrder('')).toBe(false); + }); +}); diff --git a/frontend/src/container/AllError/utils.ts b/frontend/src/container/AllError/utils.ts new file mode 100644 index 0000000000..747c75cf58 --- /dev/null +++ b/frontend/src/container/AllError/utils.ts @@ -0,0 +1,89 @@ +import { SortOrder } from 'antd/lib/table/interface'; +import Timestamp from 'timestamp-nano'; +import { Order, OrderBy } from 'types/api/errors/getAll'; + +export const isOrder = (order: string | null): order is Order => + !!(order === 'ascending' || order === 'descending'); + +export const urlKey = { + order: 'order', + offset: 'offset', + orderParam: 'orderParam', + pageSize: 'pageSize', +}; + +export const isOrderParams = (orderBy: string | null): orderBy is OrderBy 
=> { + return !!( + orderBy === 'serviceName' || + orderBy === 'exceptionCount' || + orderBy === 'lastSeen' || + orderBy === 'firstSeen' || + orderBy === 'exceptionType' + ); +}; + +export const getOrder = (order: string | null): Order => { + if (isOrder(order)) { + return order; + } + return 'ascending'; +}; + +export const getLimit = (limit: string | null): number => { + if (limit) { + return parseInt(limit, 10); + } + return 10; +}; + +export const getOffSet = (offset: string | null): number => { + if (offset && typeof offset === 'string') { + return parseInt(offset, 10); + } + return 0; +}; + +export const getOrderParams = (order: string | null): OrderBy => { + if (isOrderParams(order)) { + return order; + } + return 'serviceName'; +}; + +export const getDefaultOrder = ( + orderBy: OrderBy, + order: Order, + data: OrderBy, + // eslint-disable-next-line sonarjs/cognitive-complexity +): SortOrder | undefined => { + if (orderBy === 'exceptionType' && data === 'exceptionType') { + return order === 'ascending' ? 'ascend' : 'descend'; + } + if (orderBy === 'serviceName' && data === 'serviceName') { + return order === 'ascending' ? 'ascend' : 'descend'; + } + if (orderBy === 'exceptionCount' && data === 'exceptionCount') { + return order === 'ascending' ? 'ascend' : 'descend'; + } + if (orderBy === 'lastSeen' && data === 'lastSeen') { + return order === 'ascending' ? 'ascend' : 'descend'; + } + if (orderBy === 'firstSeen' && data === 'firstSeen') { + return order === 'ascending' ? 
'ascend' : 'descend'; + } + return undefined; +}; + +export const getNanoSeconds = (date: string): number => { + return ( + parseInt((new Date(date).getTime() / 1e3).toString(), 10) * 1e9 + + Timestamp.fromString(date).getNano() + ); +}; + +export const getUpdatePageSize = (pageSize: string | null): number => { + if (pageSize) { + return parseInt(pageSize, 10); + } + return 10; +}; diff --git a/frontend/src/container/ErrorDetails/index.tsx b/frontend/src/container/ErrorDetails/index.tsx index a5f8efe756..ea8a3c2e3e 100644 --- a/frontend/src/container/ErrorDetails/index.tsx +++ b/frontend/src/container/ErrorDetails/index.tsx @@ -1,25 +1,49 @@ import { Button, Divider, notification, Space, Table, Typography } from 'antd'; +import getNextPrevId from 'api/errors/getNextPrevId'; import Editor from 'components/Editor'; +import { getNanoSeconds } from 'container/AllError/utils'; import dayjs from 'dayjs'; import history from 'lib/history'; +import { urlKey } from 'pages/ErrorDetails/utils'; import React, { useMemo, useState } from 'react'; import { useTranslation } from 'react-i18next'; +import { useQuery } from 'react-query'; import { useLocation } from 'react-router-dom'; import { PayloadProps as GetByErrorTypeAndServicePayload } from 'types/api/errors/getByErrorTypeAndService'; -import { PayloadProps } from 'types/api/errors/getById'; import { DashedContainer, EditorContainer, EventContainer } from './styles'; function ErrorDetails(props: ErrorDetailsProps): JSX.Element { const { idPayload } = props; - const [isLoading, setLoading] = useState(false); const { t } = useTranslation(['errorDetails', 'common']); - const { search } = useLocation(); - const params = new URLSearchParams(search); - const queryErrorId = params.get('errorId'); - const serviceName = params.get('serviceName'); - const errorType = params.get('errorType'); + + const params = useMemo(() => new URLSearchParams(search), [search]); + + const errorId = params.get(urlKey.errorId); + const serviceName = 
params.get(urlKey.serviceName); + const errorType = params.get(urlKey.exceptionType); + const timestamp = params.get(urlKey.timestamp); + + const { data: nextPrevData, status: nextPrevStatus } = useQuery( + [ + idPayload.errorId, + idPayload.groupID, + idPayload.timestamp, + errorId, + serviceName, + errorType, + timestamp, + ], + { + queryFn: () => + getNextPrevId({ + errorID: errorId || idPayload.errorId, + groupID: idPayload.groupID, + timestamp: timestamp || getNanoSeconds(idPayload.timestamp).toString(), + }), + }, + ); const errorDetail = idPayload; @@ -48,34 +72,34 @@ function ErrorDetails(props: ErrorDetailsProps): JSX.Element { 'errorId', 'timestamp', 'exceptionMessage', - 'newerErrorId', - 'olderErrorId', + 'exceptionEscaped', ], [], ); - const onClickErrorIdHandler = async (id: string): Promise<void> => { + const onClickErrorIdHandler = async ( + id: string, + timespamp: string, + ): Promise<void> => { try { - setLoading(true); - if (id.length === 0) { notification.error({ message: 'Error Id cannot be empty', }); - setLoading(false); return; } - setLoading(false); - - history.push( - `${history.location.pathname}?errorId=${id}&serviceName=${serviceName}&errorType=${errorType}`, + history.replace( + `${history.location.pathname}?${urlKey.serviceName}=${serviceName}&${ + urlKey.exceptionType + }=${errorType}&groupId=${idPayload.groupID}&timestamp=${getNanoSeconds( + timespamp, + )}&errorId=${id}`, ); } catch (error) { notification.error({ message: t('something_went_wrong'), }); - setLoading(false); } }; @@ -106,25 +130,25 @@ function ErrorDetails(props: ErrorDetailsProps): JSX.Element {
- - + ); } interface EditRulesProps { - initialData: PayloadProps['data']; - ruleId: string; + initialValue: AlertDef; + ruleId: number; } export default EditRules; diff --git a/frontend/src/container/FormAlertRules/BasicInfo.tsx b/frontend/src/container/FormAlertRules/BasicInfo.tsx new file mode 100644 index 0000000000..2d1ce5eac4 --- /dev/null +++ b/frontend/src/container/FormAlertRules/BasicInfo.tsx @@ -0,0 +1,101 @@ +import { Select } from 'antd'; +import FormItem from 'antd/lib/form/FormItem'; +import React from 'react'; +import { useTranslation } from 'react-i18next'; +import { AlertDef, Labels } from 'types/api/alerts/def'; + +import LabelSelect from './labels'; +import { + FormContainer, + InputSmall, + SeveritySelect, + StepHeading, + TextareaMedium, +} from './styles'; + +const { Option } = Select; + +interface BasicInfoProps { + alertDef: AlertDef; + setAlertDef: (a: AlertDef) => void; +} + +function BasicInfo({ alertDef, setAlertDef }: BasicInfoProps): JSX.Element { + // init namespace for translations + const { t } = useTranslation('rules'); + + return ( + <> + {t('alert_form_step3')} + + + { + const s = (value as string) || 'critical'; + setAlertDef({ + ...alertDef, + labels: { + ...alertDef.labels, + severity: s, + }, + }); + }} + > + + + + + + + + + { + setAlertDef({ + ...alertDef, + alert: e.target.value, + }); + }} + /> + + + { + setAlertDef({ + ...alertDef, + annotations: { + ...alertDef.annotations, + description: e.target.value, + }, + }); + }} + /> + + + { + setAlertDef({ + ...alertDef, + labels: { + ...l, + }, + }); + }} + initialValues={alertDef.labels} + /> + + + + ); +} + +export default BasicInfo; diff --git a/frontend/src/container/FormAlertRules/ChartPreview/index.tsx b/frontend/src/container/FormAlertRules/ChartPreview/index.tsx new file mode 100644 index 0000000000..d3634d8da1 --- /dev/null +++ b/frontend/src/container/FormAlertRules/ChartPreview/index.tsx @@ -0,0 +1,119 @@ +import { InfoCircleOutlined } from '@ant-design/icons'; 
+import { StaticLineProps } from 'components/Graph'; +import GridGraphComponent from 'container/GridGraphComponent'; +import { GRAPH_TYPES } from 'container/NewDashboard/ComponentsSlider'; +import { timePreferenceType } from 'container/NewWidget/RightContainer/timeItems'; +import { Time } from 'container/TopNav/DateTimeSelection/config'; +import getChartData from 'lib/getChartData'; +import React from 'react'; +import { useTranslation } from 'react-i18next'; +import { useQuery } from 'react-query'; +import { GetMetricQueryRange } from 'store/actions/dashboard/getQueryResults'; +import { Query } from 'types/api/dashboard/getAll'; +import { EQueryType } from 'types/common/dashboard'; + +import { ChartContainer, FailedMessageContainer } from './styles'; + +export interface ChartPreviewProps { + name: string; + query: Query | undefined; + graphType?: GRAPH_TYPES; + selectedTime?: timePreferenceType; + selectedInterval?: Time; + headline?: JSX.Element; + threshold?: number; +} + +function ChartPreview({ + name, + query, + graphType = 'TIME_SERIES', + selectedTime = 'GLOBAL_TIME', + selectedInterval = '5min', + headline, + threshold, +}: ChartPreviewProps): JSX.Element | null { + const { t } = useTranslation('rules'); + const staticLine: StaticLineProps | undefined = + threshold && threshold > 0 + ? 
{ + yMin: threshold, + yMax: threshold, + borderColor: '#f14', + borderWidth: 1, + lineText: `${t('preview_chart_threshold_label')} (y=${threshold})`, + textColor: '#f14', + } + : undefined; + + const queryKey = JSON.stringify(query); + const queryResponse = useQuery({ + queryKey: ['chartPreview', queryKey, selectedInterval], + queryFn: () => + GetMetricQueryRange({ + query: query || { + queryType: 1, + promQL: [], + metricsBuilder: { + formulas: [], + queryBuilder: [], + }, + clickHouse: [], + }, + globalSelectedInterval: selectedInterval, + graphType, + selectedTime, + }), + enabled: + query != null && + (query.queryType !== EQueryType.PROM || + (query.promQL?.length > 0 && query.promQL[0].query !== '')), + }); + + const chartDataSet = queryResponse.isError + ? null + : getChartData({ + queryData: [ + { + queryData: queryResponse?.data?.payload?.data?.result + ? queryResponse?.data?.payload?.data?.result + : [], + }, + ], + }); + + return ( + + {headline} + {(queryResponse?.data?.error || queryResponse?.isError) && ( + + {' '} + {queryResponse?.data?.error || + queryResponse?.error || + t('preview_chart_unexpected_error')} + + )} + + {chartDataSet && !queryResponse.isError && ( + + )} + + ); +} + +ChartPreview.defaultProps = { + graphType: 'TIME_SERIES', + selectedTime: 'GLOBAL_TIME', + selectedInterval: '5min', + headline: undefined, + threshold: 0, +}; + +export default ChartPreview; diff --git a/frontend/src/container/FormAlertRules/ChartPreview/styles.ts b/frontend/src/container/FormAlertRules/ChartPreview/styles.ts new file mode 100644 index 0000000000..0f1617dc94 --- /dev/null +++ b/frontend/src/container/FormAlertRules/ChartPreview/styles.ts @@ -0,0 +1,28 @@ +import { Card, Tooltip } from 'antd'; +import styled from 'styled-components'; + +export const NotFoundContainer = styled.div` + display: flex; + justify-content: center; + align-items: center; + min-height: 55vh; +`; + +export const FailedMessageContainer = styled(Tooltip)` + position: absolute; + 
top: 10px; + left: 10px; +`; + +export const ChartContainer = styled(Card)` + border-radius: 4px; + &&& { + position: relative; + } + + .ant-card-body { + padding: 1.5rem 0; + height: 57vh; + /* padding-bottom: 2rem; */ + } +`; diff --git a/frontend/src/container/FormAlertRules/PromqlSection.tsx b/frontend/src/container/FormAlertRules/PromqlSection.tsx new file mode 100644 index 0000000000..129e5bb92d --- /dev/null +++ b/frontend/src/container/FormAlertRules/PromqlSection.tsx @@ -0,0 +1,49 @@ +import PromQLQueryBuilder from 'container/NewWidget/LeftContainer/QuerySection/QueryBuilder/promQL/query'; +import { IPromQLQueryHandleChange } from 'container/NewWidget/LeftContainer/QuerySection/QueryBuilder/promQL/types'; +import React from 'react'; +import { IPromQueries } from 'types/api/alerts/compositeQuery'; + +function PromqlSection({ + promQueries, + setPromQueries, +}: PromqlSectionProps): JSX.Element { + const handlePromQLQueryChange = ({ + query, + legend, + toggleDelete, + }: IPromQLQueryHandleChange): void => { + let promQuery = promQueries.A; + + // todo(amol): how to remove query, make it null? 
+ if (query) promQuery.query = query; + if (legend) promQuery.legend = legend; + if (toggleDelete) { + promQuery = { + query: '', + legend: '', + name: 'A', + disabled: false, + }; + } + setPromQueries({ + A: { + ...promQuery, + }, + }); + }; + return ( + + ); +} + +interface PromqlSectionProps { + promQueries: IPromQueries; + setPromQueries: (p: IPromQueries) => void; +} + +export default PromqlSection; diff --git a/frontend/src/container/FormAlertRules/QuerySection.tsx b/frontend/src/container/FormAlertRules/QuerySection.tsx new file mode 100644 index 0000000000..e58cdc3ace --- /dev/null +++ b/frontend/src/container/FormAlertRules/QuerySection.tsx @@ -0,0 +1,288 @@ +import { PlusOutlined } from '@ant-design/icons'; +import { notification, Tabs } from 'antd'; +import MetricsBuilderFormula from 'container/NewWidget/LeftContainer/QuerySection/QueryBuilder/queryBuilder/formula'; +import MetricsBuilder from 'container/NewWidget/LeftContainer/QuerySection/QueryBuilder/queryBuilder/query'; +import { + IQueryBuilderFormulaHandleChange, + IQueryBuilderQueryHandleChange, +} from 'container/NewWidget/LeftContainer/QuerySection/QueryBuilder/queryBuilder/types'; +import React, { useCallback } from 'react'; +import { useTranslation } from 'react-i18next'; +import { + IFormulaQueries, + IMetricQueries, + IPromQueries, +} from 'types/api/alerts/compositeQuery'; +import { EAggregateOperator, EQueryType } from 'types/common/dashboard'; + +import PromqlSection from './PromqlSection'; +import { FormContainer, QueryButton, StepHeading } from './styles'; +import { toIMetricsBuilderQuery } from './utils'; + +const { TabPane } = Tabs; +function QuerySection({ + queryCategory, + setQueryCategory, + metricQueries, + setMetricQueries, + formulaQueries, + setFormulaQueries, + promQueries, + setPromQueries, +}: QuerySectionProps): JSX.Element { + // init namespace for translations + const { t } = useTranslation('rules'); + + const handleQueryCategoryChange = (s: string): void => { + if ( + 
parseInt(s, 10) === EQueryType.PROM && + (!promQueries || Object.keys(promQueries).length === 0) + ) { + setPromQueries({ + A: { + query: '', + stats: '', + name: 'A', + legend: '', + disabled: false, + }, + }); + } + + setQueryCategory(parseInt(s, 10)); + }; + + const getNextQueryLabel = useCallback((): string => { + let maxAscii = 0; + + Object.keys(metricQueries).forEach((key) => { + const n = key.charCodeAt(0); + if (n > maxAscii) { + maxAscii = n - 64; + } + }); + + return String.fromCharCode(64 + maxAscii + 1); + }, [metricQueries]); + + const handleFormulaChange = ({ + formulaIndex, + expression, + toggleDisable, + toggleDelete, + }: IQueryBuilderFormulaHandleChange): void => { + const allFormulas = formulaQueries; + const current = allFormulas[formulaIndex]; + if (expression) { + current.expression = expression; + } + + if (toggleDisable) { + current.disabled = !current.disabled; + } + + if (toggleDelete) { + delete allFormulas[formulaIndex]; + } else { + allFormulas[formulaIndex] = current; + } + + setFormulaQueries({ + ...allFormulas, + }); + }; + + const handleMetricQueryChange = ({ + queryIndex, + aggregateFunction, + metricName, + tagFilters, + groupBy, + legend, + toggleDisable, + toggleDelete, + }: IQueryBuilderQueryHandleChange): void => { + const allQueries = metricQueries; + const current = metricQueries[queryIndex]; + if (aggregateFunction) { + current.aggregateOperator = aggregateFunction; + } + if (metricName) { + current.metricName = metricName; + } + + if (tagFilters && current.tagFilters) { + current.tagFilters.items = tagFilters; + } + + if (legend) { + current.legend = legend; + } + + if (groupBy) { + current.groupBy = groupBy; + } + + if (toggleDisable) { + current.disabled = !current.disabled; + } + + if (toggleDelete) { + delete allQueries[queryIndex]; + } else { + allQueries[queryIndex] = current; + } + + setMetricQueries({ + ...allQueries, + }); + }; + + const addMetricQuery = useCallback(() => { + if 
(Object.keys(metricQueries).length > 5) { + notification.error({ + message: t('metric_query_max_limit'), + }); + return; + } + + const queryLabel = getNextQueryLabel(); + + const queries = metricQueries; + queries[queryLabel] = { + name: queryLabel, + queryName: queryLabel, + metricName: '', + formulaOnly: false, + aggregateOperator: EAggregateOperator.NOOP, + legend: '', + tagFilters: { + op: 'AND', + items: [], + }, + groupBy: [], + disabled: false, + expression: queryLabel, + }; + setMetricQueries({ ...queries }); + }, [t, getNextQueryLabel, metricQueries, setMetricQueries]); + + const addFormula = useCallback(() => { + // defaulting to F1 as only one formula is supported + // in alert definition + const queryLabel = 'F1'; + + const formulas = formulaQueries; + formulas[queryLabel] = { + queryName: queryLabel, + name: queryLabel, + formulaOnly: true, + expression: 'A', + disabled: false, + }; + + setFormulaQueries({ ...formulas }); + }, [formulaQueries, setFormulaQueries]); + + const renderPromqlUI = (): JSX.Element => { + return ( + + ); + }; + + const renderFormulaButton = (): JSX.Element => { + return ( + }> + {t('button_formula')} + + ); + }; + + const renderQueryButton = (): JSX.Element => { + return ( + }> + {t('button_query')} + + ); + }; + + const renderMetricUI = (): JSX.Element => { + return ( +
+ {metricQueries && + Object.keys(metricQueries).map((key: string) => { + // todo(amol): need to handle this in fetch + const current = metricQueries[key]; + current.name = key; + + return ( + + ); + })} + + {queryCategory !== EQueryType.PROM && renderQueryButton()} +
+ {formulaQueries && + Object.keys(formulaQueries).map((key: string) => { + // todo(amol): need to handle this in fetch + const current = formulaQueries[key]; + current.name = key; + + return ( + + ); + })} + {queryCategory === EQueryType.QUERY_BUILDER && + (!formulaQueries || Object.keys(formulaQueries).length === 0) && + metricQueries && + Object.keys(metricQueries).length > 0 && + renderFormulaButton()} +
+
+ ); + }; + return ( + <> + {t('alert_form_step1')} + +
+ + + + +
+ {queryCategory === EQueryType.PROM ? renderPromqlUI() : renderMetricUI()} +
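Review note: the change handlers in this component (`handleMetricQueryChange`, `handleFormulaChange`) mutate the existing query objects in place and then spread only the outer map, so the per-entry references stay identical and memoized children may not re-render. A sketch of an immutable per-entry update, with `IMetricQuery` reduced to a hypothetical `MetricQueryLite` for illustration:

```typescript
// Hypothetical minimal shape standing in for IMetricQuery.
interface MetricQueryLite {
  name: string;
  metricName: string;
  disabled: boolean;
}

// Clone both the map and the touched entry so React (and any
// React.memo children) see fresh references at every level.
function updateMetricQuery(
  queries: Record<string, MetricQueryLite>,
  key: string,
  patch: Partial<MetricQueryLite>,
): Record<string, MetricQueryLite> {
  return { ...queries, [key]: { ...queries[key], ...patch } };
}
```

The same pattern would apply to the formula handler; deletion would be a destructured rest (`const { [key]: removed, ...rest } = queries`) rather than `delete` on the live state object.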
+ + ); +} + +interface QuerySectionProps { + queryCategory: EQueryType; + setQueryCategory: (n: EQueryType) => void; + metricQueries: IMetricQueries; + setMetricQueries: (b: IMetricQueries) => void; + formulaQueries: IFormulaQueries; + setFormulaQueries: (b: IFormulaQueries) => void; + promQueries: IPromQueries; + setPromQueries: (p: IPromQueries) => void; +} + +export default QuerySection; diff --git a/frontend/src/container/FormAlertRules/RuleOptions.tsx b/frontend/src/container/FormAlertRules/RuleOptions.tsx new file mode 100644 index 0000000000..a4cc5844f4 --- /dev/null +++ b/frontend/src/container/FormAlertRules/RuleOptions.tsx @@ -0,0 +1,174 @@ +import { Select, Typography } from 'antd'; +import FormItem from 'antd/lib/form/FormItem'; +import React from 'react'; +import { useTranslation } from 'react-i18next'; +import { + AlertDef, + defaultCompareOp, + defaultEvalWindow, + defaultMatchType, +} from 'types/api/alerts/def'; +import { EQueryType } from 'types/common/dashboard'; + +import { + FormContainer, + InlineSelect, + StepHeading, + ThresholdInput, +} from './styles'; + +const { Option } = Select; + +function RuleOptions({ + alertDef, + setAlertDef, + queryCategory, +}: RuleOptionsProps): JSX.Element { + // init namespace for translations + const { t } = useTranslation('rules'); + + const handleMatchOptChange = (value: string | unknown): void => { + const m = (value as string) || alertDef.condition?.matchType; + setAlertDef({ + ...alertDef, + condition: { + ...alertDef.condition, + matchType: m, + }, + }); + }; + + const renderCompareOps = (): JSX.Element => { + return ( + { + const newOp = (value as string) || ''; + + setAlertDef({ + ...alertDef, + condition: { + ...alertDef.condition, + op: newOp, + }, + }); + }} + > + + + + + + ); + }; + + const renderThresholdMatchOpts = (): JSX.Element => { + return ( + handleMatchOptChange(value)} + > + + + + + + ); + }; + + const renderPromMatchOpts = (): JSX.Element => { + return ( + handleMatchOptChange(value)} + 
> + + + ); + }; + + const renderEvalWindows = (): JSX.Element => { + return ( + { + const ew = (value as string) || alertDef.evalWindow; + setAlertDef({ + ...alertDef, + evalWindow: ew, + }); + }} + > + {' '} + + + + + + + ); + }; + + const renderThresholdRuleOpts = (): JSX.Element => { + return ( + + + {t('text_condition1')} {renderCompareOps()} {t('text_condition2')}{' '} + {renderThresholdMatchOpts()} {t('text_condition3')} {renderEvalWindows()} + + + ); + }; + const renderPromRuleOptions = (): JSX.Element => { + return ( + + + {t('text_condition1')} {renderCompareOps()} {t('text_condition2')}{' '} + {renderPromMatchOpts()} + + + ); + }; + + return ( + <> + {t('alert_form_step2')} + + {queryCategory === EQueryType.PROM + ? renderPromRuleOptions() + : renderThresholdRuleOpts()} +
+ { + setAlertDef({ + ...alertDef, + condition: { + ...alertDef.condition, + target: (value as number) || undefined, + }, + }); + }} + /> +
+
+ + ); +} + +interface RuleOptionsProps { + alertDef: AlertDef; + setAlertDef: (a: AlertDef) => void; + queryCategory: EQueryType; +} +export default RuleOptions; diff --git a/frontend/src/container/FormAlertRules/index.tsx b/frontend/src/container/FormAlertRules/index.tsx new file mode 100644 index 0000000000..1a1615fe52 --- /dev/null +++ b/frontend/src/container/FormAlertRules/index.tsx @@ -0,0 +1,366 @@ +import { ExclamationCircleOutlined, SaveOutlined } from '@ant-design/icons'; +import { FormInstance, Modal, notification, Typography } from 'antd'; +import saveAlertApi from 'api/alerts/save'; +import ROUTES from 'constants/routes'; +import QueryTypeTag from 'container/NewWidget/LeftContainer/QueryTypeTag'; +import PlotTag from 'container/NewWidget/LeftContainer/WidgetGraph/PlotTag'; +import history from 'lib/history'; +import React, { useCallback, useEffect, useState } from 'react'; +import { useTranslation } from 'react-i18next'; +import { useQueryClient } from 'react-query'; +import { + IFormulaQueries, + IMetricQueries, + IPromQueries, +} from 'types/api/alerts/compositeQuery'; +import { + AlertDef, + defaultEvalWindow, + defaultMatchType, +} from 'types/api/alerts/def'; +import { Query as StagedQuery } from 'types/api/dashboard/getAll'; +import { EQueryType } from 'types/common/dashboard'; + +import BasicInfo from './BasicInfo'; +import ChartPreview from './ChartPreview'; +import QuerySection from './QuerySection'; +import RuleOptions from './RuleOptions'; +import { ActionButton, ButtonContainer, MainFormContainer } from './styles'; +import useDebounce from './useDebounce'; +import { + prepareBuilderQueries, + prepareStagedQuery, + toChartInterval, + toFormulaQueries, + toMetricQueries, +} from './utils'; + +function FormAlertRules({ + formInstance, + initialValue, + ruleId, +}: FormAlertRuleProps): JSX.Element { + // init namespace for translations + const { t } = useTranslation('rules'); + + // use query client + const ruleCache = useQueryClient(); + + 
const [loading, setLoading] = useState(false); + + // alertDef holds the form values to be posted + const [alertDef, setAlertDef] = useState(initialValue); + + // initQuery contains initial query when component was mounted + const initQuery = initialValue?.condition?.compositeMetricQuery; + + const [queryCategory, setQueryCategory] = useState( + initQuery?.queryType, + ); + + // local state to handle metric queries + const [metricQueries, setMetricQueries] = useState( + toMetricQueries(initQuery?.builderQueries), + ); + + // local state to handle formula queries + const [formulaQueries, setFormulaQueries] = useState( + toFormulaQueries(initQuery?.builderQueries), + ); + + // local state to handle promql queries + const [promQueries, setPromQueries] = useState({ + ...initQuery?.promQueries, + }); + + // staged query is used to display chart preview + const [stagedQuery, setStagedQuery] = useState(); + const debouncedStagedQuery = useDebounce(stagedQuery, 500); + + // this use effect initiates staged query and + // other queries based on server data. 
+ // useful when fetching of initial values (from api) + // is delayed + useEffect(() => { + const initQuery = initialValue?.condition?.compositeMetricQuery; + const typ = initQuery?.queryType; + + // extract metric query from builderQueries + const mq = toMetricQueries(initQuery?.builderQueries); + + // extract formula query from builderQueries + const fq = toFormulaQueries(initQuery?.builderQueries); + + // prepare staged query + const sq = prepareStagedQuery(typ, mq, fq, initQuery?.promQueries); + const pq = initQuery?.promQueries; + + setQueryCategory(typ); + setMetricQueries(mq); + setFormulaQueries(fq); + setPromQueries(pq); + setStagedQuery(sq); + setAlertDef(initialValue); + }, [initialValue]); + + // this useEffect updates staging query when + // any of its sub-parameters changes + useEffect(() => { + // prepare staged query + const sq: StagedQuery = prepareStagedQuery( + queryCategory, + metricQueries, + formulaQueries, + promQueries, + ); + setStagedQuery(sq); + }, [queryCategory, metricQueries, formulaQueries, promQueries]); + + const onCancelHandler = useCallback(() => { + history.replace(ROUTES.LIST_ALL_ALERT); + }, []); + + // onQueryCategoryChange handles changes to query category + // in state as well as sets additional defaults + const onQueryCategoryChange = (val: EQueryType): void => { + setQueryCategory(val); + if (val === EQueryType.PROM) { + setAlertDef({ + ...alertDef, + condition: { + ...alertDef.condition, + matchType: defaultMatchType, + }, + evalWindow: defaultEvalWindow, + }); + } + }; + + const isFormValid = useCallback((): boolean => { + let retval = true; + + if (!alertDef.alert || alertDef.alert === '') { + notification.error({ + message: 'Error', + description: t('alertname_required'), + }); + return false; + } + + if ( + queryCategory === EQueryType.PROM && + (!promQueries || Object.keys(promQueries).length === 0) + ) { + notification.error({ + message: 'Error', + description: t('promql_required'), + }); + return false; + } + + if 
( + (queryCategory === EQueryType.QUERY_BUILDER && !metricQueries) || + Object.keys(metricQueries).length === 0 + ) { + notification.error({ + message: 'Error', + description: t('condition_required'), + }); + return false; + } + + Object.keys(metricQueries).forEach((key) => { + if (metricQueries[key].metricName === '') { + retval = false; + notification.error({ + message: 'Error', + description: t('metricname_missing', { where: metricQueries[key].name }), + }); + } + }); + + Object.keys(formulaQueries).forEach((key) => { + if (formulaQueries[key].expression === '') { + retval = false; + notification.error({ + message: 'Error', + description: t('expression_missing', formulaQueries[key].name), + }); + } + }); + + return retval; + }, [t, alertDef, queryCategory, metricQueries, formulaQueries, promQueries]); + + const saveRule = useCallback(async () => { + if (!isFormValid()) { + return; + } + + const postableAlert: AlertDef = { + ...alertDef, + source: window?.location.toString(), + ruleType: + queryCategory === EQueryType.PROM ? 'promql_rule' : 'threshold_rule', + condition: { + ...alertDef.condition, + compositeMetricQuery: { + builderQueries: prepareBuilderQueries(metricQueries, formulaQueries), + promQueries, + queryType: queryCategory, + }, + }, + }; + + setLoading(true); + try { + const apiReq = + ruleId && ruleId > 0 + ? { data: postableAlert, id: ruleId } + : { data: postableAlert }; + + const response = await saveAlertApi(apiReq); + + if (response.statusCode === 200) { + notification.success({ + message: 'Success', + description: + !ruleId || ruleId === 0 ? 
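Review note: the query-builder check in `isFormValid` above groups its parentheses as `(isBuilder && !metricQueries) || size === 0`, so an empty `metricQueries` map raises `condition_required` even when the selected category is PromQL. A sketch of the intended grouping as a standalone predicate:

```typescript
// Sketch of the intended validation: builder queries are only required
// when the query-builder category is actually selected.
function builderQueriesMissing(
  isQueryBuilder: boolean,
  metricQueries: Record<string, unknown> | undefined,
): boolean {
  return (
    isQueryBuilder &&
    (!metricQueries || Object.keys(metricQueries).length === 0)
  );
}
```

A related nit in the same block: the second notification passes `formulaQueries[key].name` as the bare second argument to `t('expression_missing', …)`, while the metric branch correctly uses `{ where: … }`; the two should match.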
t('rule_created') : t('rule_edited'), + }); + console.log('invalidting cache'); + // invalidate rule in cache + ruleCache.invalidateQueries(['ruleId', ruleId]); + + setTimeout(() => { + history.replace(ROUTES.LIST_ALL_ALERT); + }, 2000); + } else { + notification.error({ + message: 'Error', + description: response.error || t('unexpected_error'), + }); + } + } catch (e) { + console.log('save alert api failed:', e); + notification.error({ + message: 'Error', + description: t('unexpected_error'), + }); + } + setLoading(false); + }, [ + t, + isFormValid, + queryCategory, + ruleId, + alertDef, + metricQueries, + formulaQueries, + promQueries, + ruleCache, + ]); + + const onSaveHandler = useCallback(async () => { + const content = ( + + {' '} + {t('confirm_save_content_part1')} {' '} + {t('confirm_save_content_part2')} + + ); + Modal.confirm({ + icon: , + title: t('confirm_save_title'), + centered: true, + content, + onOk() { + saveRule(); + }, + }); + }, [t, saveRule, queryCategory]); + + const renderBasicInfo = (): JSX.Element => ( + + ); + + const renderQBChartPreview = (): JSX.Element => { + return ( + } + name="" + threshold={alertDef.condition?.target} + query={debouncedStagedQuery} + selectedInterval={toChartInterval(alertDef.evalWindow)} + /> + ); + }; + + const renderPromChartPreview = (): JSX.Element => { + return ( + } + name="Chart Preview" + threshold={alertDef.condition?.target} + query={debouncedStagedQuery} + /> + ); + }; + + return ( + <> + {Element} + + {queryCategory === EQueryType.QUERY_BUILDER && renderQBChartPreview()} + {queryCategory === EQueryType.PROM && renderPromChartPreview()} + + + + + {renderBasicInfo()} + + } + > + {ruleId > 0 ? 
t('button_savechanges') : t('button_createrule')} + + + {ruleId === 0 && t('button_cancelchanges')} + {ruleId > 0 && t('button_discard')} + + + + + ); +} + +interface FormAlertRuleProps { + formInstance: FormInstance; + initialValue: AlertDef; + ruleId: number; +} + +export default FormAlertRules; diff --git a/frontend/src/container/FormAlertRules/labels/Labels.machine.ts b/frontend/src/container/FormAlertRules/labels/Labels.machine.ts new file mode 100644 index 0000000000..812a498c65 --- /dev/null +++ b/frontend/src/container/FormAlertRules/labels/Labels.machine.ts @@ -0,0 +1,49 @@ +import { createMachine } from 'xstate'; + +export const ResourceAttributesFilterMachine = + /** @xstate-layout N4IgpgJg5mDOIC5QBECGsAWAjA9qgThAAQDKYBAxhkQIIB2xAYgJYA2ALmPgHQAqqUANJgAngGIAcgFEAGr0SgADjljN2zHHQUgAHogAcAFgAM3AOz6ATAEYAzJdsA2Y4cOWAnABoQIxAFpDR2tuQ319AFYTcKdbFycAX3jvNExcAmIySmp6JjZOHn4hUTFNACFWAFd8bWVVdU1tPQQzY1MXY2tDdzNHM3dHd0NvXwR7biMTa313S0i+63DE5PRsPEJScnwqWgYiFg4uPgFhcQAlKRIpeSQQWrUNLRumx3Czbg8TR0sbS31jfUcw38fW47gBHmm4XCVms3SWIBSq3SGyyO1yBx4AHlFFxUOwcPhJLJrkoVPcGk9ENYFuF3i5YR0wtEHECEAEgiEmV8zH1DLYzHZ4Yi0utMltsrt9vluNjcfjCWVKtUbnd6o9QE1rMYBtxbGFvsZ3NrZj1WdYOfotUZLX0XEFHEKViKMpttjk9nlDrL8HiCWJzpcSbcyWrGoh3NCQj0zK53P1ph1WeFLLqnJZ2s5vmZLA6kginWsXaj3VLDoUAGqoSpgEp0cpVGohh5hhDWDy0sz8zruakzamWVm-Qyg362V5-AZOayO1KFlHitEejFHKCV6v+i5XRt1ZuU1s52zjNOOaZfdOWIY+RDZ0Hc6ZmKEXqyLPPCudit2Sz08ACSEFYNbSHI27kuquiIOEjiONwjJgrM3RWJYZisgEIJgnYPTmuEdi2OaiR5nQOAQHA2hvsiH4Sui0qFCcIGhnuLSmP0YJuJ2xjJsmKELG8XZTK0tjdHG06vgW5GupRS7St6vrKqSO4UhqVL8TBWp8o4eqdl0A5Xmy3G6gK56-B4uERDOSKiuJi6lgUAhrhUYB0buimtrEKZBDYrxaS0OZca8+ltheybOI4hivGZzrzp+VGHH+AGOQp4EIHy+ghNYnawtG4TsbYvk8QKfHGAJfQ9uF76WSW37xWBTSGJ0qXpd0vRZdEKGPqC2YeO2-zfO4+HxEAA */ + createMachine({ + tsTypes: {} as import('./Labels.machine.typegen').Typegen0, + initial: 'Idle', + states: { + LabelKey: { + on: { + NEXT: { + actions: 'onSelectLabelValue', + target: 'LabelValue', + }, + onBlur: { + actions: 'onSelectLabelValue', + target: 'LabelValue', + 
}, + RESET: { + target: 'Idle', + }, + }, + }, + LabelValue: { + on: { + NEXT: { + actions: ['onValidateQuery'], + }, + onBlur: { + actions: ['onValidateQuery'], + // target: 'Idle', + }, + RESET: { + target: 'Idle', + }, + }, + }, + Idle: { + on: { + NEXT: { + actions: 'onSelectLabelKey', + description: 'Enter a label key', + target: 'LabelKey', + }, + }, + }, + }, + id: 'Label Key Values', + }); diff --git a/frontend/src/container/FormAlertRules/labels/Labels.machine.typegen.ts b/frontend/src/container/FormAlertRules/labels/Labels.machine.typegen.ts new file mode 100644 index 0000000000..f31469f659 --- /dev/null +++ b/frontend/src/container/FormAlertRules/labels/Labels.machine.typegen.ts @@ -0,0 +1,25 @@ +// This file was automatically generated. Edits will be overwritten + +export interface Typegen0 { + '@@xstate/typegen': true; + eventsCausingActions: { + onSelectLabelValue: 'NEXT' | 'onBlur'; + onValidateQuery: 'NEXT' | 'onBlur'; + onSelectLabelKey: 'NEXT'; + }; + internalEvents: { + 'xstate.init': { type: 'xstate.init' }; + }; + invokeSrcNameMap: {}; + missingImplementations: { + actions: 'onSelectLabelValue' | 'onValidateQuery' | 'onSelectLabelKey'; + services: never; + guards: never; + delays: never; + }; + eventsCausingServices: {}; + eventsCausingGuards: {}; + eventsCausingDelays: {}; + matchesStates: 'LabelKey' | 'LabelValue' | 'Idle'; + tags: never; +} diff --git a/frontend/src/container/FormAlertRules/labels/QueryChip.tsx b/frontend/src/container/FormAlertRules/labels/QueryChip.tsx new file mode 100644 index 0000000000..47e4c956ff --- /dev/null +++ b/frontend/src/container/FormAlertRules/labels/QueryChip.tsx @@ -0,0 +1,26 @@ +import React from 'react'; + +import { QueryChipContainer, QueryChipItem } from './styles'; +import { ILabelRecord } from './types'; + +interface QueryChipProps { + queryData: ILabelRecord; + onRemove: (id: string) => void; +} + +export default function QueryChip({ + queryData, + onRemove, +}: QueryChipProps): JSX.Element { + 
const { key, value } = queryData; + return ( + + onRemove(key)} + > + {key}: {value} + + + ); +} diff --git a/frontend/src/container/FormAlertRules/labels/index.tsx b/frontend/src/container/FormAlertRules/labels/index.tsx new file mode 100644 index 0000000000..1ce72d306c --- /dev/null +++ b/frontend/src/container/FormAlertRules/labels/index.tsx @@ -0,0 +1,164 @@ +import { + CloseCircleFilled, + ExclamationCircleOutlined, +} from '@ant-design/icons'; +import { useMachine } from '@xstate/react'; +import { Button, Input, message, Modal } from 'antd'; +import { map } from 'lodash-es'; +import React, { useCallback, useEffect, useState } from 'react'; +import { useTranslation } from 'react-i18next'; +import { useSelector } from 'react-redux'; +import { AppState } from 'store/reducers'; +import { Labels } from 'types/api/alerts/def'; +import AppReducer from 'types/reducer/app'; +import { v4 as uuid } from 'uuid'; + +import { ResourceAttributesFilterMachine } from './Labels.machine'; +import QueryChip from './QueryChip'; +import { QueryChipItem, SearchContainer } from './styles'; +import { ILabelRecord } from './types'; +import { createQuery, flattenLabels, prepareLabels } from './utils'; + +interface LabelSelectProps { + onSetLabels: (q: Labels) => void; + initialValues: Labels | undefined; +} + +function LabelSelect({ + onSetLabels, + initialValues, +}: LabelSelectProps): JSX.Element | null { + const { t } = useTranslation('rules'); + const { isDarkMode } = useSelector((state) => state.app); + const [currentVal, setCurrentVal] = useState(''); + const [staging, setStaging] = useState([]); + const [queries, setQueries] = useState( + initialValues ? 
flattenLabels(initialValues) : [], + ); + + const dispatchChanges = (updatedRecs: ILabelRecord[]): void => { + onSetLabels(prepareLabels(updatedRecs, initialValues)); + setQueries(updatedRecs); + }; + + const [state, send] = useMachine(ResourceAttributesFilterMachine, { + actions: { + onSelectLabelKey: () => {}, + onSelectLabelValue: () => { + if (currentVal !== '') { + setStaging((prevState) => [...prevState, currentVal]); + } else { + return; + } + setCurrentVal(''); + }, + onValidateQuery: (): void => { + if (currentVal === '') { + return; + } + + const generatedQuery = createQuery([...staging, currentVal]); + + if (generatedQuery) { + dispatchChanges([...queries, generatedQuery]); + setStaging([]); + setCurrentVal(''); + send('RESET'); + } + }, + }, + }); + + const handleFocus = (): void => { + if (state.value === 'Idle') { + send('NEXT'); + } + }; + + const handleBlur = useCallback((): void => { + if (staging.length === 1 && staging[0] !== undefined) { + send('onBlur'); + } + }, [send, staging]); + + useEffect(() => { + handleBlur(); + }, [handleBlur]); + + const handleChange = (e: React.ChangeEvent): void => { + setCurrentVal(e.target?.value); + }; + + const handleClose = (key: string): void => { + dispatchChanges(queries.filter((queryData) => queryData.key !== key)); + }; + + const handleClearAll = (): void => { + Modal.confirm({ + title: 'Confirm', + icon: , + content: t('remove_label_confirm'), + onOk() { + send('RESET'); + dispatchChanges([]); + setStaging([]); + message.success(t('remove_label_success')); + }, + okText: t('button_yes'), + cancelText: t('button_no'), + }); + }; + const renderPlaceholder = useCallback((): string => { + if (state.value === 'LabelKey') return 'Enter a label key then press ENTER.'; + if (state.value === 'LabelValue') + return `Enter a value for label key(${staging[0]}) then press ENTER.`; + return t('placeholder_label_key_pair'); + }, [t, state, staging]); + return ( + +
+ {queries.length > 0 && + map( + queries, + (query): JSX.Element => { + return ( + + ); + }, + )} +
+
+ {map(staging, (item) => { + return {item}; + })} +
+ +
+ { + if (e.key === 'Enter' || e.code === 'Enter') { + send('NEXT'); + } + }} + bordered={false} + value={currentVal as never} + style={{ flex: 1 }} + onFocus={handleFocus} + onBlur={handleBlur} + /> + + {queries.length || staging.length || currentVal ? ( +
+
+ ); +} + +export default LabelSelect; diff --git a/frontend/src/container/FormAlertRules/labels/styles.ts b/frontend/src/container/FormAlertRules/labels/styles.ts new file mode 100644 index 0000000000..04d6871315 --- /dev/null +++ b/frontend/src/container/FormAlertRules/labels/styles.ts @@ -0,0 +1,35 @@ +import { grey } from '@ant-design/colors'; +import { Tag } from 'antd'; +import styled from 'styled-components'; + +interface SearchContainerProps { + isDarkMode: boolean; + disabled: boolean; +} + +export const SearchContainer = styled.div` + width: 70%; + border-radisu: 4px; + background: ${({ isDarkMode }): string => (isDarkMode ? '#000' : '#fff')}; + flex: 1; + display: flex; + flex-direction: column; + padding: 0.2rem; + border: 1px solid #ccc5; + ${({ disabled }): string => (disabled ? `cursor: not-allowed;` : '')} +`; + +export const QueryChipContainer = styled.span` + display: flex; + align-items: center; + margin-right: 0.5rem; + &:hover { + & > * { + background: ${grey.primary}44; + } + } +`; + +export const QueryChipItem = styled(Tag)` + margin-right: 0.1rem; +`; diff --git a/frontend/src/container/FormAlertRules/labels/types.ts b/frontend/src/container/FormAlertRules/labels/types.ts new file mode 100644 index 0000000000..b10fc3fded --- /dev/null +++ b/frontend/src/container/FormAlertRules/labels/types.ts @@ -0,0 +1,9 @@ +export interface ILabelRecord { + key: string; + value: string; +} + +export interface IOption { + label: string; + value: string; +} diff --git a/frontend/src/container/FormAlertRules/labels/utils.ts b/frontend/src/container/FormAlertRules/labels/utils.ts new file mode 100644 index 0000000000..1a2943f3ee --- /dev/null +++ b/frontend/src/container/FormAlertRules/labels/utils.ts @@ -0,0 +1,54 @@ +import { Labels } from 'types/api/alerts/def'; + +import { ILabelRecord } from './types'; + +const hiddenLabels = ['severity', 'description']; + +export const createQuery = ( + selectedItems: Array = [], +): ILabelRecord | null => { + if 
(selectedItems.length === 2) { + return { + key: selectedItems[0] as string, + value: selectedItems[1] as string, + }; + } + return null; +}; + +export const flattenLabels = (labels: Labels): ILabelRecord[] => { + const recs: ILabelRecord[] = []; + + Object.keys(labels).forEach((key) => { + if (!hiddenLabels.includes(key)) { + recs.push({ + key, + value: labels[key], + }); + } + }); + + return recs; +}; + +export const prepareLabels = ( + recs: ILabelRecord[], + alertLabels: Labels | undefined, +): Labels => { + const labels: Labels = {}; + + recs.forEach((rec) => { + if (!hiddenLabels.includes(rec.key)) { + labels[rec.key] = rec.value; + } + }); + if (alertLabels) { + Object.keys(alertLabels).forEach((key) => { + if (hiddenLabels.includes(key)) { + labels[key] = alertLabels[key]; + } + }); + } + + return labels; +}; diff --git a/frontend/src/container/FormAlertRules/styles.ts b/frontend/src/container/FormAlertRules/styles.ts new file mode 100644 index 0000000000..1626becfa6 --- /dev/null +++ b/frontend/src/container/FormAlertRules/styles.ts @@ -0,0 +1,90 @@ +import { Button, Card, Form, Input, InputNumber, Select } from 'antd'; +import TextArea from 'antd/lib/input/TextArea'; +import styled from 'styled-components'; + +export const MainFormContainer = styled(Form)` + max-width: 900px; +`; + +export const ButtonContainer = styled.div` + &&& { + display: flex; + justify-content: flex-start; + align-items: center; + margin-top: 1rem; + margin-bottom: 3rem; + } +`; + +export const ActionButton = styled(Button)` + margin-right: 1rem; +`; + +export const QueryButton = styled(Button)` + &&& { + display: flex; + align-items: center; + margin-right: 1rem; + } +`; + +export const QueryContainer = styled(Card)` + &&& { + margin-top: 1rem; + min-height: 23.5%; + } +`; + +export const Container = styled.div` + margin-top: 1rem; + display: flex; + flex-direction: column; +`; + +export const StepHeading = styled.p` + margin-top: 1rem; + font-weight: bold; +`; + +export const 
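Review note: the label utilities above rely on `flattenLabels` hiding `severity`/`description` from the editor while `prepareLabels` re-injects them from the previous alert labels. That round-trip invariant is worth pinning down; a minimal re-statement (with `Labels` assumed to be `Record<string, string>`, matching its use here):

```typescript
// Minimal re-statement of the hidden-label round trip from labels/utils.ts.
type Labels = Record<string, string>;
const hiddenLabels = ['severity', 'description'];

function flatten(labels: Labels): Array<{ key: string; value: string }> {
  // hidden labels never reach the editable chip list
  return Object.keys(labels)
    .filter((k) => !hiddenLabels.includes(k))
    .map((key) => ({ key, value: labels[key] }));
}

function prepare(
  recs: Array<{ key: string; value: string }>,
  prev?: Labels,
): Labels {
  const out: Labels = {};
  recs.forEach((r) => {
    if (!hiddenLabels.includes(r.key)) out[r.key] = r.value;
  });
  // carry hidden labels over from the previous value untouched
  if (prev) {
    Object.keys(prev).forEach((k) => {
      if (hiddenLabels.includes(k)) out[k] = prev[k];
    });
  }
  return out;
}
```

Editing the visible labels and saving should therefore preserve `severity` exactly, while user edits to the visible keys survive.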
InlineSelect = styled(Select)` + display: inline-block; + width: 10% !important; + margin-left: 0.2em; + margin-right: 0.2em; +`; + +export const SeveritySelect = styled(Select)` + width: 15% !important; +`; + +export const InputSmall = styled(Input)` + width: 40% !important; +`; + +export const FormContainer = styled.div` + padding: 2em; + margin-top: 1rem; + display: flex; + flex-direction: column; + background: #141414; + border-radius: 4px; + border: 1px solid #303030; +`; + +export const ThresholdInput = styled(InputNumber)` + & > div { + display: flex; + align-items: center; + & > .ant-input-number-group-addon { + width: 130px; + } + & > .ant-input-number { + width: 50%; + margin-left: 1em; + } + } +`; + +export const TextareaMedium = styled(TextArea)` + width: 70%; +`; diff --git a/frontend/src/container/FormAlertRules/useDebounce.js b/frontend/src/container/FormAlertRules/useDebounce.js new file mode 100644 index 0000000000..e430f55d63 --- /dev/null +++ b/frontend/src/container/FormAlertRules/useDebounce.js @@ -0,0 +1,31 @@ +/* eslint-disable */ +// @ts-ignore +// @ts-nocheck + +import { useEffect, useState } from 'react'; + +// see https://github.com/tannerlinsley/react-query/issues/293 +// see https://usehooks.com/useDebounce/ +export default function useDebounce(value, delay) { + // State and setters for debounced value + const [debouncedValue, setDebouncedValue] = useState(value); + + useEffect( + () => { + // Update debounced value after delay + const handler = setTimeout(() => { + setDebouncedValue(value); + }, delay); + + // Cancel the timeout if value changes (also on delay change or unmount) + // This is how we prevent debounced value from updating if value is changed ... + // .. within the delay period. Timeout gets cleared and restarted. 
+ return () => { + clearTimeout(handler); + }; + }, + [value, delay] // Only re-call effect if value or delay changes + ); + + return debouncedValue; +} diff --git a/frontend/src/container/FormAlertRules/utils.ts b/frontend/src/container/FormAlertRules/utils.ts new file mode 100644 index 0000000000..c6a93d28bc --- /dev/null +++ b/frontend/src/container/FormAlertRules/utils.ts @@ -0,0 +1,134 @@ +import { Time } from 'container/TopNav/DateTimeSelection/config'; +import { + IBuilderQueries, + IFormulaQueries, + IFormulaQuery, + IMetricQueries, + IMetricQuery, + IPromQueries, + IPromQuery, +} from 'types/api/alerts/compositeQuery'; +import { + IMetricsBuilderQuery, + Query as IStagedQuery, +} from 'types/api/dashboard/getAll'; +import { EQueryType } from 'types/common/dashboard'; + +export const toFormulaQueries = (b: IBuilderQueries): IFormulaQueries => { + const f: IFormulaQueries = {}; + if (!b) return f; + Object.keys(b).forEach((key) => { + if (key === 'F1') { + f[key] = b[key] as IFormulaQuery; + } + }); + + return f; +}; + +export const toMetricQueries = (b: IBuilderQueries): IMetricQueries => { + const m: IMetricQueries = {}; + if (!b) return m; + Object.keys(b).forEach((key) => { + if (key !== 'F1') { + m[key] = b[key] as IMetricQuery; + } + }); + + return m; +}; + +export const toIMetricsBuilderQuery = ( + q: IMetricQuery, +): IMetricsBuilderQuery => { + return { + name: q.name, + metricName: q.metricName, + tagFilters: q.tagFilters, + groupBy: q.groupBy, + aggregateOperator: q.aggregateOperator, + disabled: q.disabled, + legend: q.legend, + }; +}; + +export const prepareBuilderQueries = ( + m: IMetricQueries, + f: IFormulaQueries, +): IBuilderQueries => { + if (!m) return {}; + const b: IBuilderQueries = { + ...m, + }; + + Object.keys(f).forEach((key) => { + b[key] = { + ...f[key], + aggregateOperator: undefined, + metricName: '', + }; + }); + return b; +}; + +export const prepareStagedQuery = ( + t: EQueryType, + m: IMetricQueries, + f: IFormulaQueries, + 
p: IPromQueries, +): IStagedQuery => { + const qbList: IMetricQuery[] = []; + const formulaList: IFormulaQuery[] = []; + const promList: IPromQuery[] = []; + + // convert map[string]IMetricQuery to IMetricQuery[] + if (m) { + Object.keys(m).forEach((key) => { + qbList.push(m[key]); + }); + } + + // convert map[string]IFormulaQuery to IFormulaQuery[] + if (f) { + Object.keys(f).forEach((key) => { + formulaList.push(f[key]); + }); + } + + // convert map[string]IPromQuery to IPromQuery[] + if (p) { + Object.keys(p).forEach((key) => { + promList.push({ ...p[key], name: key }); + }); + } + + return { + queryType: t, + promQL: promList, + metricsBuilder: { + formulas: formulaList, + queryBuilder: qbList, + }, + clickHouse: [], + }; +}; + +// toChartInterval converts eval window to chart selection time interval +export const toChartInterval = (evalWindow: string | undefined): Time => { + switch (evalWindow) { + case '5m0s': + return '5min'; + case '10m0s': + return '10min'; + case '15m0s': + return '15min'; + case '30m0s': + return '30min'; + case '60m0s': + return '30min'; + case '1440m0s': + return '1day'; + default: + return '5min'; + } +}; diff --git a/frontend/src/container/GridGraphComponent/index.tsx b/frontend/src/container/GridGraphComponent/index.tsx index d2139b1a08..3a1b84e963 100644 --- a/frontend/src/container/GridGraphComponent/index.tsx +++ b/frontend/src/container/GridGraphComponent/index.tsx @@ -1,6 +1,6 @@ import { Typography } from 'antd'; import { ChartData } from 'chart.js'; -import Graph, { GraphOnClickHandler } from 'components/Graph'; +import Graph, { GraphOnClickHandler, StaticLineProps } from 'components/Graph'; import { getYAxisFormattedValue } from 'components/Graph/yAxisConfig'; import ValueGraph from 'components/ValueGraph'; import { GRAPH_TYPES } from 'container/NewDashboard/ComponentsSlider'; @@ -18,6 +18,7 @@ function GridGraphComponent({ onClickHandler, name, yAxisUnit, + staticLine, }: GridGraphComponentProps): JSX.Element | null { 
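Review note: in `toChartInterval` above, `'60m0s'` falls through to `'30min'`, which looks unintended next to the otherwise one-to-one mapping. A table-driven sketch, assuming the `Time` union exposes a one-hour option (`TimeLite` below is a local stand-in, not the real type):

```typescript
// Local stand-in for the Time union used by DateTimeSelection.
type TimeLite = '5min' | '10min' | '15min' | '30min' | '1hr' | '1day';

// Table-driven version of toChartInterval: one entry per eval window,
// defaulting to '5min' for unknown or missing values.
function toChartIntervalSketch(evalWindow?: string): TimeLite {
  const intervals: Record<string, TimeLite> = {
    '5m0s': '5min',
    '10m0s': '10min',
    '15m0s': '15min',
    '30m0s': '30min',
    '60m0s': '1hr',
    '1440m0s': '1day',
  };
  return (evalWindow && intervals[evalWindow]) || '5min';
}
```

The table form also makes future windows a one-line addition instead of a new `case`.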
const location = history.location.pathname; @@ -36,6 +37,7 @@ function GridGraphComponent({ onClickHandler, name, yAxisUnit, + staticLine, }} /> ); @@ -82,6 +84,7 @@ export interface GridGraphComponentProps { onClickHandler?: GraphOnClickHandler; name: string; yAxisUnit?: string; + staticLine?: StaticLineProps; } GridGraphComponent.defaultProps = { @@ -90,6 +93,7 @@ GridGraphComponent.defaultProps = { isStacked: undefined, onClickHandler: undefined, yAxisUnit: undefined, + staticLine: undefined, }; export default GridGraphComponent; diff --git a/frontend/src/container/ListAlertRules/ListAlert.tsx b/frontend/src/container/ListAlertRules/ListAlert.tsx index b851b0829a..4df6290725 100644 --- a/frontend/src/container/ListAlertRules/ListAlert.tsx +++ b/frontend/src/container/ListAlertRules/ListAlert.tsx @@ -64,9 +64,14 @@ function ListAlert({ allAlertRules, refetch }: ListAlertProps): JSX.Element { }, { title: 'Alert Name', - dataIndex: 'name', + dataIndex: 'alert', key: 'name', sorter: (a, b): number => a.name.charCodeAt(0) - b.name.charCodeAt(0), + render: (value, record): JSX.Element => ( + onEditHandler(record.id.toString())}> + {value} + + ), }, { title: 'Severity', @@ -83,7 +88,7 @@ function ListAlert({ allAlertRules, refetch }: ListAlertProps): JSX.Element { }, }, { - title: 'Tags', + title: 'Labels', dataIndex: 'labels', key: 'tags', align: 'center', @@ -100,7 +105,7 @@ function ListAlert({ allAlertRules, refetch }: ListAlertProps): JSX.Element { {withOutSeverityKeys.map((e) => { return ( - {e} + {e}: {value[e]} ); })} diff --git a/frontend/src/container/NewWidget/LeftContainer/QuerySection/QueryBuilder/promQL/index.tsx b/frontend/src/container/NewWidget/LeftContainer/QuerySection/QueryBuilder/promQL/index.tsx index 4dee33c779..55adbd740b 100644 --- a/frontend/src/container/NewWidget/LeftContainer/QuerySection/QueryBuilder/promQL/index.tsx +++ b/frontend/src/container/NewWidget/LeftContainer/QuerySection/QueryBuilder/promQL/index.tsx @@ -29,7 +29,7 @@ function 
PromQLQueryContainer({ toggleDelete, }: IPromQLQueryHandleChange): void => { const allQueries = queryData[WIDGET_PROMQL_QUERY_KEY_NAME]; - const currentIndexQuery = allQueries[queryIndex]; + const currentIndexQuery = allQueries[queryIndex as number]; if (query !== undefined) currentIndexQuery.query = query; if (legend !== undefined) currentIndexQuery.legend = legend; @@ -37,7 +37,7 @@ function PromQLQueryContainer({ currentIndexQuery.disabled = !currentIndexQuery.disabled; } if (toggleDelete) { - allQueries.splice(queryIndex, 1); + allQueries.splice(queryIndex as number, 1); } updateQueryData({ updatedQuery: { ...queryData } }); }; diff --git a/frontend/src/container/NewWidget/LeftContainer/QuerySection/QueryBuilder/promQL/query.tsx b/frontend/src/container/NewWidget/LeftContainer/QuerySection/QueryBuilder/promQL/query.tsx index 1a6dd2f9d2..6cffd55d8d 100644 --- a/frontend/src/container/NewWidget/LeftContainer/QuerySection/QueryBuilder/promQL/query.tsx +++ b/frontend/src/container/NewWidget/LeftContainer/QuerySection/QueryBuilder/promQL/query.tsx @@ -7,7 +7,7 @@ import { IPromQLQueryHandleChange } from './types'; interface IPromQLQueryBuilderProps { queryData: IPromQLQuery; - queryIndex: number; + queryIndex: number | string; handleQueryChange: (args: IPromQLQueryHandleChange) => void; } diff --git a/frontend/src/container/NewWidget/LeftContainer/QuerySection/QueryBuilder/promQL/types.ts b/frontend/src/container/NewWidget/LeftContainer/QuerySection/QueryBuilder/promQL/types.ts index f1c88dd488..668a0c1f87 100644 --- a/frontend/src/container/NewWidget/LeftContainer/QuerySection/QueryBuilder/promQL/types.ts +++ b/frontend/src/container/NewWidget/LeftContainer/QuerySection/QueryBuilder/promQL/types.ts @@ -1,7 +1,7 @@ import { IPromQLQuery } from 'types/api/dashboard/getAll'; export interface IPromQLQueryHandleChange { - queryIndex: number; + queryIndex: number | string; query?: IPromQLQuery['query']; legend?: IPromQLQuery['legend']; toggleDisable?: 
IPromQLQuery['disabled']; diff --git a/frontend/src/container/NewWidget/LeftContainer/QuerySection/QueryBuilder/queryBuilder/formula.tsx b/frontend/src/container/NewWidget/LeftContainer/QuerySection/QueryBuilder/queryBuilder/formula.tsx index 5be08f044e..02bc41198c 100644 --- a/frontend/src/container/NewWidget/LeftContainer/QuerySection/QueryBuilder/queryBuilder/formula.tsx +++ b/frontend/src/container/NewWidget/LeftContainer/QuerySection/QueryBuilder/queryBuilder/formula.tsx @@ -9,7 +9,7 @@ const { TextArea } = Input; interface IMetricsBuilderFormulaProps { formulaData: IMetricsBuilderFormula; - formulaIndex: number; + formulaIndex: number | string; handleFormulaChange: (args: IQueryBuilderFormulaHandleChange) => void; } function MetricsBuilderFormula({ diff --git a/frontend/src/container/NewWidget/LeftContainer/QuerySection/QueryBuilder/queryBuilder/index.tsx b/frontend/src/container/NewWidget/LeftContainer/QuerySection/QueryBuilder/queryBuilder/index.tsx index 5b05eeca91..fdb6d4b7bc 100644 --- a/frontend/src/container/NewWidget/LeftContainer/QuerySection/QueryBuilder/queryBuilder/index.tsx +++ b/frontend/src/container/NewWidget/LeftContainer/QuerySection/QueryBuilder/queryBuilder/index.tsx @@ -50,7 +50,7 @@ function QueryBuilderQueryContainer({ }: IQueryBuilderQueryHandleChange): void => { const allQueries = queryData[WIDGET_QUERY_BUILDER_QUERY_KEY_NAME].queryBuilder; - const currentIndexQuery = allQueries[queryIndex]; + const currentIndexQuery = allQueries[queryIndex as number]; if (aggregateFunction) { currentIndexQuery.aggregateOperator = aggregateFunction; } @@ -78,7 +78,7 @@ function QueryBuilderQueryContainer({ currentIndexQuery.disabled = !currentIndexQuery.disabled; } if (toggleDelete) { - allQueries.splice(queryIndex, 1); + allQueries.splice(queryIndex as number, 1); } updateQueryData({ updatedQuery: { ...queryData } }); }; @@ -92,7 +92,7 @@ function QueryBuilderQueryContainer({ queryData[WIDGET_QUERY_BUILDER_QUERY_KEY_NAME][ 
WIDGET_QUERY_BUILDER_FORMULA_KEY_NAME ]; - const currentIndexFormula = allFormulas[formulaIndex]; + const currentIndexFormula = allFormulas[formulaIndex as number]; if (expression) { currentIndexFormula.expression = expression; @@ -103,7 +103,7 @@ function QueryBuilderQueryContainer({ } if (toggleDelete) { - allFormulas.splice(formulaIndex, 1); + allFormulas.splice(formulaIndex as number, 1); } updateQueryData({ updatedQuery: { ...queryData } }); }; diff --git a/frontend/src/container/NewWidget/LeftContainer/QuerySection/QueryBuilder/queryBuilder/query.tsx b/frontend/src/container/NewWidget/LeftContainer/QuerySection/QueryBuilder/queryBuilder/query.tsx index fccf108b41..8f171baa3c 100644 --- a/frontend/src/container/NewWidget/LeftContainer/QuerySection/QueryBuilder/queryBuilder/query.tsx +++ b/frontend/src/container/NewWidget/LeftContainer/QuerySection/QueryBuilder/queryBuilder/query.tsx @@ -15,7 +15,7 @@ import { IQueryBuilderQueryHandleChange } from './types'; const { Option } = Select; interface IMetricsBuilderProps { - queryIndex: number; + queryIndex: number | string; selectedGraph: GRAPH_TYPES; queryData: IMetricsBuilderQuery; handleQueryChange: (args: IQueryBuilderQueryHandleChange) => void; diff --git a/frontend/src/container/NewWidget/LeftContainer/QuerySection/QueryBuilder/queryBuilder/types.ts b/frontend/src/container/NewWidget/LeftContainer/QuerySection/QueryBuilder/queryBuilder/types.ts index 8d177cffd8..c577b8d123 100644 --- a/frontend/src/container/NewWidget/LeftContainer/QuerySection/QueryBuilder/queryBuilder/types.ts +++ b/frontend/src/container/NewWidget/LeftContainer/QuerySection/QueryBuilder/queryBuilder/types.ts @@ -4,7 +4,7 @@ import { } from 'types/api/dashboard/getAll'; export interface IQueryBuilderQueryHandleChange { - queryIndex: number; + queryIndex: number | string; aggregateFunction?: IMetricsBuilderQuery['aggregateOperator']; metricName?: IMetricsBuilderQuery['metricName']; tagFilters?: IMetricsBuilderQuery['tagFilters']['items']; @@ 
-16,7 +16,7 @@ export interface IQueryBuilderQueryHandleChange { } export interface IQueryBuilderFormulaHandleChange { - formulaIndex: number; + formulaIndex: number | string; expression?: IMetricsBuilderFormula['expression']; toggleDisable?: IMetricsBuilderFormula['disabled']; toggleDelete?: boolean; diff --git a/frontend/src/container/TopNav/DateTimeSelection/config.ts b/frontend/src/container/TopNav/DateTimeSelection/config.ts index 29d031e25b..427cb8786e 100644 --- a/frontend/src/container/TopNav/DateTimeSelection/config.ts +++ b/frontend/src/container/TopNav/DateTimeSelection/config.ts @@ -1,6 +1,7 @@ import ROUTES from 'constants/routes'; type FiveMin = '5min'; +type TenMin = '10min'; type FifteenMin = '15min'; type ThirtyMin = '30min'; type OneMin = '1min'; @@ -12,6 +13,7 @@ type Custom = 'custom'; export type Time = | FiveMin + | TenMin | FifteenMin | ThirtyMin | OneMin diff --git a/frontend/src/lib/getMinMax.ts b/frontend/src/lib/getMinMax.ts index 9c1fab94c3..cd6f26a496 100644 --- a/frontend/src/lib/getMinMax.ts +++ b/frontend/src/lib/getMinMax.ts @@ -13,6 +13,9 @@ const GetMinMax = ( if (interval === '1min') { const minTimeAgo = getMinAgo({ minutes: 1 }).getTime(); minTime = minTimeAgo; + } else if (interval === '10min') { + const minTimeAgo = getMinAgo({ minutes: 10 }).getTime(); + minTime = minTimeAgo; } else if (interval === '15min') { const minTimeAgo = getMinAgo({ minutes: 15 }).getTime(); minTime = minTimeAgo; diff --git a/frontend/src/pages/CreateAlert/index.tsx b/frontend/src/pages/CreateAlert/index.tsx index edfe543b1f..3bab0c1ee7 100644 --- a/frontend/src/pages/CreateAlert/index.tsx +++ b/frontend/src/pages/CreateAlert/index.tsx @@ -1,109 +1,9 @@ -import { SaveOutlined } from '@ant-design/icons'; -import { Button, notification } from 'antd'; -import createAlertsApi from 'api/alerts/create'; -import Editor from 'components/Editor'; -import ROUTES from 'constants/routes'; -import { State } from 'hooks/useFetch'; -import history from 
'lib/history'; -import React, { useCallback, useState } from 'react'; -import { PayloadProps as CreateAlertPayloadProps } from 'types/api/alerts/create'; +import CreateAlertRule from 'container/CreateAlertRule'; +import React from 'react'; +import { alertDefaults } from 'types/api/alerts/create'; -import { ButtonContainer, Title } from './styles'; - -function CreateAlert(): JSX.Element { - const [value, setEditorValue] = useState( - `\n alert: High RPS\n expr: sum(rate(signoz_latency_count{span_kind="SPAN_KIND_SERVER"}[2m])) by (service_name) > 100\n for: 0m\n labels:\n severity: warning\n annotations:\n summary: High RPS of Applications\n description: "RPS is > 100\n\t\t\t VALUE = {{ $value }}\n\t\t\t LABELS = {{ $labels }}"\n `, - ); - - const [newAlertState, setNewAlertState] = useState< - State - >({ - error: false, - errorMessage: '', - loading: false, - payload: undefined, - success: false, - }); - const [notifications, Element] = notification.useNotification(); - - const defaultError = - 'Oops! Some issue occured in saving the alert please try again or contact support@signoz.io'; - - const onSaveHandler = useCallback(async () => { - try { - setNewAlertState((state) => ({ - ...state, - loading: true, - })); - - if (value.length === 0) { - setNewAlertState((state) => ({ - ...state, - loading: false, - })); - notifications.error({ - description: `Oops! We didn't catch that. Please make sure the alert settings are not empty or try again`, - message: 'Error', - }); - return; - } - - const response = await createAlertsApi({ - query: value, - }); - - if (response.statusCode === 200) { - setNewAlertState((state) => ({ - ...state, - loading: false, - payload: response.payload, - })); - notifications.success({ - message: 'Success', - description: 'Congrats. 
The alert was saved correctly.', - }); - - setTimeout(() => { - history.push(ROUTES.LIST_ALL_ALERT); - }, 3000); - } else { - notifications.error({ - description: response.error || defaultError, - message: 'Error', - }); - setNewAlertState((state) => ({ - ...state, - loading: false, - error: true, - errorMessage: response.error || defaultError, - })); - } - } catch (error) { - notifications.error({ - message: defaultError, - }); - } - }, [notifications, value]); - - return ( - <> - {Element} - - Create New Alert - setEditorValue(value)} value={value} /> - - - - - - ); +function CreateAlertPage(): JSX.Element { + return ; } -export default CreateAlert; +export default CreateAlertPage; diff --git a/frontend/src/pages/EditRules/index.tsx b/frontend/src/pages/EditRules/index.tsx index 09cda600ab..0217e40efc 100644 --- a/frontend/src/pages/EditRules/index.tsx +++ b/frontend/src/pages/EditRules/index.tsx @@ -47,7 +47,12 @@ function EditRules(): JSX.Element { return ; } - return ; + return ( + + ); } export default EditRules; diff --git a/frontend/src/types/api/alerts/compositeQuery.ts b/frontend/src/types/api/alerts/compositeQuery.ts new file mode 100644 index 0000000000..868eb712c4 --- /dev/null +++ b/frontend/src/types/api/alerts/compositeQuery.ts @@ -0,0 +1,64 @@ +import { + IMetricsBuilderFormula, + IMetricsBuilderQuery, + IPromQLQuery, + IQueryBuilderTagFilters, +} from 'types/api/dashboard/getAll'; +import { EAggregateOperator, EQueryType } from 'types/common/dashboard'; + +export interface ICompositeMetricQuery { + builderQueries: IBuilderQueries; + promQueries: IPromQueries; + queryType: EQueryType; +} + +export interface IPromQueries { + [key: string]: IPromQuery; +} + +export interface IPromQuery extends IPromQLQuery { + stats?: ''; +} + +export interface IBuilderQueries { + [key: string]: IBuilderQuery; +} + +// IBuilderQuery combines IMetricQuery and IFormulaQuery +// for api calls +export interface IBuilderQuery + extends Omit< + IMetricQuery, + 
'aggregateOperator' | 'legend' | 'metricName' | 'tagFilters' + >, + Omit { + aggregateOperator: EAggregateOperator | undefined; + disabled: boolean; + name: string; + legend?: string; + metricName: string | null; + groupBy?: string[]; + expression?: string; + tagFilters?: IQueryBuilderTagFilters; + toggleDisable?: boolean; + toggleDelete?: boolean; +} + +export interface IFormulaQueries { + [key: string]: IFormulaQuery; +} + +export interface IFormulaQuery extends IMetricsBuilderFormula { + formulaOnly: boolean; + queryName: string; +} + +export interface IMetricQueries { + [key: string]: IMetricQuery; +} + +export interface IMetricQuery extends IMetricsBuilderQuery { + formulaOnly: boolean; + expression?: string; + queryName: string; +} diff --git a/frontend/src/types/api/alerts/create.ts b/frontend/src/types/api/alerts/create.ts index 6a2e5c09ab..6f179af79a 100644 --- a/frontend/src/types/api/alerts/create.ts +++ b/frontend/src/types/api/alerts/create.ts @@ -1,8 +1,48 @@ +import { AlertDef } from 'types/api/alerts/def'; + +import { defaultCompareOp, defaultEvalWindow, defaultMatchType } from './def'; + export interface Props { - query: string; + data: AlertDef; } export interface PayloadProps { status: string; data: string; } + +export const alertDefaults: AlertDef = { + condition: { + compositeMetricQuery: { + builderQueries: { + A: { + queryName: 'A', + name: 'A', + formulaOnly: false, + metricName: '', + tagFilters: { + op: 'AND', + items: [], + }, + groupBy: [], + aggregateOperator: 1, + expression: 'A', + disabled: false, + toggleDisable: false, + toggleDelete: false, + }, + }, + promQueries: {}, + queryType: 1, + }, + op: defaultCompareOp, + matchType: defaultMatchType, + }, + labels: { + severity: 'warning', + }, + annotations: { + description: 'A new alert', + }, + evalWindow: defaultEvalWindow, +}; diff --git a/frontend/src/types/api/alerts/def.ts b/frontend/src/types/api/alerts/def.ts new file mode 100644 index 0000000000..060bdc4d73 --- /dev/null +++ 
b/frontend/src/types/api/alerts/def.ts @@ -0,0 +1,32 @@ +import { ICompositeMetricQuery } from 'types/api/alerts/compositeQuery'; + +// default match type for threshold +export const defaultMatchType = '1'; + +// default eval window +export const defaultEvalWindow = '5m0s'; + +// default compare op: above +export const defaultCompareOp = '1'; + +export interface AlertDef { + id?: number; + alert?: string; + ruleType?: string; + condition: RuleCondition; + labels?: Labels; + annotations?: Labels; + evalWindow?: string; + source?: string; +} + +export interface RuleCondition { + compositeMetricQuery: ICompositeMetricQuery; + op?: string | undefined; + target?: number | undefined; + matchType?: string | undefined; +} + +export interface Labels { + [key: string]: string; +} diff --git a/frontend/src/types/api/alerts/get.ts b/frontend/src/types/api/alerts/get.ts index 52e9a78e7b..69eef474e1 100644 --- a/frontend/src/types/api/alerts/get.ts +++ b/frontend/src/types/api/alerts/get.ts @@ -1,9 +1,9 @@ -import { Alerts } from './getAll'; +import { AlertDef } from './def'; export interface Props { - id: Alerts['id']; + id: AlertDef['id']; } export type PayloadProps = { - data: string; + data: AlertDef; }; diff --git a/frontend/src/types/api/alerts/put.ts b/frontend/src/types/api/alerts/put.ts deleted file mode 100644 index e70de0b630..0000000000 --- a/frontend/src/types/api/alerts/put.ts +++ /dev/null @@ -1,9 +0,0 @@ -import { PayloadProps as DeletePayloadProps } from './delete'; -import { Alerts } from './getAll'; - -export type PayloadProps = DeletePayloadProps; - -export interface Props { - id: Alerts['id']; - data: DeletePayloadProps['data']; -} diff --git a/frontend/src/types/api/alerts/queryType.ts b/frontend/src/types/api/alerts/queryType.ts new file mode 100644 index 0000000000..277d6f0703 --- /dev/null +++ b/frontend/src/types/api/alerts/queryType.ts @@ -0,0 +1,17 @@ +export type QueryType = 1 | 2 | 3; + +export const QUERY_BUILDER: QueryType = 1; +export const 
PROMQL: QueryType = 3; + +export const resolveQueryCategoryName = (s: number): string => { + switch (s) { + case 1: + return 'Query Builder'; + case 2: + return 'Clickhouse Query'; + case 3: + return 'PromQL'; + default: + return ''; + } +}; diff --git a/frontend/src/types/api/alerts/save.ts b/frontend/src/types/api/alerts/save.ts new file mode 100644 index 0000000000..a815c728d2 --- /dev/null +++ b/frontend/src/types/api/alerts/save.ts @@ -0,0 +1,11 @@ +import { AlertDef } from './def'; + +export type PayloadProps = { + status: string; + data: string; +}; + +export interface Props { + id?: number; + data: AlertDef; +} From 3200248e986b6e1f8e05ceacf0656e06e0aac10e Mon Sep 17 00:00:00 2001 From: Palash Date: Thu, 14 Jul 2022 17:14:13 +0530 Subject: [PATCH 26/43] fix: error page is updated (#1394) --- frontend/src/container/AllError/index.tsx | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/frontend/src/container/AllError/index.tsx b/frontend/src/container/AllError/index.tsx index 58b9c1201a..eef4fca88b 100644 --- a/frontend/src/container/AllError/index.tsx +++ b/frontend/src/container/AllError/index.tsx @@ -188,14 +188,14 @@ function AllErrors(): JSX.Element { sorter, ) => { if (!Array.isArray(sorter)) { - const { current = 0, pageSize = 0 } = paginations; + const { pageSize = 0, current = 0 } = paginations; const { columnKey = '', order } = sorter; const updatedOrder = order === 'ascend' ? 
'ascending' : 'descending'; history.replace( `${pathname}?${createQueryParams({ order: updatedOrder, - offset: current - 1, + offset: (current - 1) * pageSize, orderParam: columnKey, pageSize, })}`, @@ -213,7 +213,7 @@ function AllErrors(): JSX.Element { pagination={{ pageSize: getUpdatedPageSize, responsive: true, - current: getUpdatedOffset + 1, + current: getUpdatedOffset / 10 + 1, position: ['bottomLeft'], total: errorCountResponse.data?.payload || 0, }} From a63267cf907e1c6e0c2dd4a9189913da473ac06b Mon Sep 17 00:00:00 2001 From: Amol Date: Thu, 14 Jul 2022 17:28:30 +0530 Subject: [PATCH 27/43] feat: remove global time selection on alerts pages --- frontend/src/container/TopNav/index.tsx | 3 +++ 1 file changed, 3 insertions(+) diff --git a/frontend/src/container/TopNav/index.tsx b/frontend/src/container/TopNav/index.tsx index ffd1b28175..ddf10023a3 100644 --- a/frontend/src/container/TopNav/index.tsx +++ b/frontend/src/container/TopNav/index.tsx @@ -19,6 +19,9 @@ const routesToSkip = [ ROUTES.ALL_DASHBOARD, ROUTES.ORG_SETTINGS, ROUTES.ERROR_DETAIL, + ROUTES.ALERTS_NEW, + ROUTES.EDIT_ALERTS, + ROUTES.LIST_ALL_ALERT, ]; function TopNav(): JSX.Element | null { From a2e1c41343b08e450adfc1c2291be28c1ff467d1 Mon Sep 17 00:00:00 2001 From: Amol Umbark Date: Thu, 14 Jul 2022 18:23:02 +0530 Subject: [PATCH 28/43] fix: edit form shows incorrect eval window when 24hours is saved (#1393) * fix: edit form shows incorrect eval window when 24hours is saved * fix: edit form shows incorrect eval window when 24hours is saved * feat: added 4 hour window to alert ui Co-authored-by: Palash --- frontend/public/locales/en-GB/rules.json | 1 + frontend/public/locales/en/rules.json | 1 + frontend/src/container/FormAlertRules/RuleOptions.tsx | 3 ++- frontend/src/container/FormAlertRules/utils.ts | 4 +++- frontend/src/container/TopNav/DateTimeSelection/config.ts | 2 ++ frontend/src/lib/getMinMax.ts | 5 +++-- 6 files changed, 12 insertions(+), 4 deletions(-) diff --git 
a/frontend/public/locales/en-GB/rules.json b/frontend/public/locales/en-GB/rules.json index 3e8ceb63cb..0602d7393c 100644 --- a/frontend/public/locales/en-GB/rules.json +++ b/frontend/public/locales/en-GB/rules.json @@ -33,6 +33,7 @@ "option_10min": "10 mins", "option_15min": "15 mins", "option_60min": "60 mins", + "option_4hours": "4 hours", "option_24hours": "24 hours", "field_threshold": "Alert Threshold", "option_allthetimes": "all the times", diff --git a/frontend/public/locales/en/rules.json b/frontend/public/locales/en/rules.json index 3e8ceb63cb..0602d7393c 100644 --- a/frontend/public/locales/en/rules.json +++ b/frontend/public/locales/en/rules.json @@ -33,6 +33,7 @@ "option_10min": "10 mins", "option_15min": "15 mins", "option_60min": "60 mins", + "option_4hours": "4 hours", "option_24hours": "24 hours", "field_threshold": "Alert Threshold", "option_allthetimes": "all the times", diff --git a/frontend/src/container/FormAlertRules/RuleOptions.tsx b/frontend/src/container/FormAlertRules/RuleOptions.tsx index a4cc5844f4..8bc87c483d 100644 --- a/frontend/src/container/FormAlertRules/RuleOptions.tsx +++ b/frontend/src/container/FormAlertRules/RuleOptions.tsx @@ -112,7 +112,8 @@ function RuleOptions({ - + + ); }; diff --git a/frontend/src/container/FormAlertRules/utils.ts b/frontend/src/container/FormAlertRules/utils.ts index c6a93d28bc..ea40ce692b 100644 --- a/frontend/src/container/FormAlertRules/utils.ts +++ b/frontend/src/container/FormAlertRules/utils.ts @@ -126,7 +126,9 @@ export const toChartInterval = (evalWindow: string | undefined): Time => { return '30min'; case '60m0s': return '30min'; - case '1440m0s': + case '4h0m0s': + return '4hr'; + case '24h0m0s': return '1day'; default: return '5min'; diff --git a/frontend/src/container/TopNav/DateTimeSelection/config.ts b/frontend/src/container/TopNav/DateTimeSelection/config.ts index 427cb8786e..69bdde40c7 100644 --- a/frontend/src/container/TopNav/DateTimeSelection/config.ts +++ 
b/frontend/src/container/TopNav/DateTimeSelection/config.ts @@ -7,6 +7,7 @@ type ThirtyMin = '30min'; type OneMin = '1min'; type SixHour = '6hr'; type OneHour = '1hr'; +type FourHour = '4hr'; type OneDay = '1day'; type OneWeek = '1week'; type Custom = 'custom'; @@ -17,6 +18,7 @@ export type Time = | FifteenMin | ThirtyMin | OneMin + | FourHour | SixHour | OneHour | Custom diff --git a/frontend/src/lib/getMinMax.ts b/frontend/src/lib/getMinMax.ts index cd6f26a496..ae830cc06a 100644 --- a/frontend/src/lib/getMinMax.ts +++ b/frontend/src/lib/getMinMax.ts @@ -36,8 +36,9 @@ const GetMinMax = ( // one week = one day * 7 const minTimeAgo = getMinAgo({ minutes: 26 * 60 * 7 }).getTime(); minTime = minTimeAgo; - } else if (interval === '6hr') { - const minTimeAgo = getMinAgo({ minutes: 6 * 60 }).getTime(); + } else if (['4hr', '6hr'].includes(interval)) { + const h = parseInt(interval.replace('hr', ''), 10); + const minTimeAgo = getMinAgo({ minutes: h * 60 }).getTime(); minTime = minTimeAgo; } else if (interval === 'custom') { maxTime = (dateTimeRange || [])[1] || 0; From 6fb7e34dbc7a5b6380f16c3792d81bada64bb193 Mon Sep 17 00:00:00 2001 From: Prashant Shahi Date: Thu, 14 Jul 2022 19:36:19 +0530 Subject: [PATCH 29/43] =?UTF-8?q?chore:=20=F0=9F=94=A7=20otel-collector=20?= =?UTF-8?q?config=20changes=20(#1388)?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit * chore: πŸ”§ otel-collector config changes * chore: πŸ—‘οΈ remove redundant users.xml * chore: πŸ”§ otel-config changes - separate scraper job for otel-collector and otel-collector-metrics internal metrics - use resourcedetection only for hostmetrics - add swarm service name and task name in resource attributes env Signed-off-by: Prashant Shahi --- .../clickhouse-setup/docker-compose.yaml | 21 ++- .../otel-collector-config.yaml | 59 +++++++-- .../otel-collector-metrics-config.yaml | 44 +++++-- .../clickhouse-setup/docker-compose.yaml | 21 ++- .../otel-collector-config.yaml | 58 
+++++++-- .../otel-collector-metrics-config.yaml | 39 +++++- deploy/docker/clickhouse-setup/users.xml | 123 ------------------ .../tests/test-deploy/docker-compose.yaml | 2 + .../test-deploy/otel-collector-config.yaml | 71 +++++++--- .../otel-collector-metrics-config.yaml | 53 ++++++-- 10 files changed, 281 insertions(+), 210 deletions(-) delete mode 100644 deploy/docker/clickhouse-setup/users.xml diff --git a/deploy/docker-swarm/clickhouse-setup/docker-compose.yaml b/deploy/docker-swarm/clickhouse-setup/docker-compose.yaml index 2bd2a48bde..148f0aa77b 100644 --- a/deploy/docker-swarm/clickhouse-setup/docker-compose.yaml +++ b/deploy/docker-swarm/clickhouse-setup/docker-compose.yaml @@ -86,15 +86,19 @@ services: volumes: - ./otel-collector-config.yaml:/etc/otel-collector-config.yaml ports: + # - "1777:1777" # pprof extension - "4317:4317" # OTLP gRPC receiver - "4318:4318" # OTLP HTTP receiver - # - "8889:8889" # Prometheus metrics exposed by the agent - # - "13133:13133" # health_check - # - "14268:14268" # Jaeger receiver + # - "8888:8888" # OtelCollector internal metrics + # - "8889:8889" # signoz spanmetrics exposed by the agent + # - "9411:9411" # Zipkin port + # - "13133:13133" # Health check extension + # - "14250:14250" # Jaeger gRPC + # - "14268:14268" # Jaeger thrift HTTP # - "55678:55678" # OpenCensus receiver - # - "55679:55679" # zpages extension - # - "55680:55680" # OTLP gRPC legacy receiver - # - "55681:55681" # OTLP HTTP legacy receiver + # - "55679:55679" # zPages extension + environment: + - OTEL_RESOURCE_ATTRIBUTES=host.name={{.Node.Hostname}},os.type={{.Node.Platform.OS}},dockerswarm.service.name={{.Service.Name}},dockerswarm.task.name={{.Task.Name}} deploy: mode: replicated replicas: 3 @@ -111,6 +115,11 @@ services: command: ["--config=/etc/otel-collector-metrics-config.yaml"] volumes: - ./otel-collector-metrics-config.yaml:/etc/otel-collector-metrics-config.yaml + # ports: + # - "1777:1777" # pprof extension + # - "8888:8888" # OtelCollector 
internal metrics + # - "13133:13133" # Health check extension + # - "55679:55679" # zPages extension deploy: restart_policy: condition: on-failure diff --git a/deploy/docker-swarm/clickhouse-setup/otel-collector-config.yaml b/deploy/docker-swarm/clickhouse-setup/otel-collector-config.yaml index a998d93ab9..61292c5781 100644 --- a/deploy/docker-swarm/clickhouse-setup/otel-collector-config.yaml +++ b/deploy/docker-swarm/clickhouse-setup/otel-collector-config.yaml @@ -1,30 +1,46 @@ receivers: + opencensus: + endpoint: 0.0.0.0:55678 otlp/spanmetrics: protocols: grpc: - endpoint: "localhost:12345" + endpoint: localhost:12345 otlp: protocols: grpc: + endpoint: 0.0.0.0:4317 http: + endpoint: 0.0.0.0:4318 jaeger: protocols: grpc: + endpoint: 0.0.0.0:14250 thrift_http: + endpoint: 0.0.0.0:14268 + # thrift_compact: + # endpoint: 0.0.0.0:6831 + # thrift_binary: + # endpoint: 0.0.0.0:6832 hostmetrics: collection_interval: 60s scrapers: - cpu: - load: - memory: - disk: - filesystem: - network: + cpu: {} + load: {} + memory: {} + disk: {} + filesystem: {} + network: {} + processors: batch: send_batch_size: 10000 send_batch_max_size: 11000 timeout: 10s + resourcedetection: + # Using OTEL_RESOURCE_ATTRIBUTES envvar, env detector adds custom labels. + detectors: [env, system] # include ec2 for AWS, gce for GCP and azure for Azure. 
+ timeout: 2s + override: false signozspanmetrics/prometheus: metrics_exporter: prometheus latency_histogram_buckets: [100us, 1ms, 2ms, 6ms, 10ms, 50ms, 100ms, 250ms, 500ms, 1000ms, 1400ms, 2000ms, 5s, 10s, 20s, 40s, 60s ] @@ -49,9 +65,7 @@ processors: # num_workers: 4 # queue_size: 100 # retry_on_failure: true -extensions: - health_check: {} - zpages: {} + exporters: clickhousetraces: datasource: tcp://clickhouse:9000/?database=signoz_traces @@ -60,18 +74,35 @@ exporters: resource_to_telemetry_conversion: enabled: true prometheus: - endpoint: "0.0.0.0:8889" + endpoint: 0.0.0.0:8889 + # logging: {} + +extensions: + health_check: + endpoint: 0.0.0.0:13133 + zpages: + endpoint: 0.0.0.0:55679 + pprof: + endpoint: 0.0.0.0:1777 + service: - extensions: [health_check, zpages] + telemetry: + metrics: + address: 0.0.0.0:8888 + extensions: [health_check, zpages, pprof] pipelines: traces: receivers: [jaeger, otlp] processors: [signozspanmetrics/prometheus, batch] exporters: [clickhousetraces] metrics: - receivers: [otlp, hostmetrics] + receivers: [otlp] processors: [batch] exporters: [clickhousemetricswrite] + metrics/hostmetrics: + receivers: [hostmetrics] + processors: [resourcedetection, batch] + exporters: [clickhousemetricswrite] metrics/spanmetrics: receivers: [otlp/spanmetrics] - exporters: [prometheus] \ No newline at end of file + exporters: [prometheus] diff --git a/deploy/docker-swarm/clickhouse-setup/otel-collector-metrics-config.yaml b/deploy/docker-swarm/clickhouse-setup/otel-collector-metrics-config.yaml index 3aa39b5f7e..a01f356437 100644 --- a/deploy/docker-swarm/clickhouse-setup/otel-collector-metrics-config.yaml +++ b/deploy/docker-swarm/clickhouse-setup/otel-collector-metrics-config.yaml @@ -1,17 +1,26 @@ receivers: - otlp: - protocols: - grpc: - http: - - # Data sources: metrics prometheus: config: scrape_configs: + # otel-collector internal metrics - job_name: "otel-collector" scrape_interval: 60s static_configs: - - targets: ["otel-collector:8889"] + - 
targets: + - otel-collector:8888 + # otel-collector-metrics internal metrics + - job_name: "otel-collector-metrics" + scrape_interval: 60s + static_configs: + - targets: + - localhost:8888 + # SigNoz span metrics + - job_name: "signozspanmetrics-collector" + scrape_interval: 60s + static_configs: + - targets: + - otel-collector:8889 + processors: batch: send_batch_size: 10000 @@ -32,17 +41,26 @@ processors: # num_workers: 4 # queue_size: 100 # retry_on_failure: true -extensions: - health_check: {} - zpages: {} + exporters: clickhousemetricswrite: endpoint: tcp://clickhouse:9000/?database=signoz_metrics +extensions: + health_check: + endpoint: 0.0.0.0:13133 + zpages: + endpoint: 0.0.0.0:55679 + pprof: + endpoint: 0.0.0.0:1777 + service: - extensions: [health_check, zpages] + telemetry: + metrics: + address: 0.0.0.0:8888 + extensions: [health_check, zpages, pprof] pipelines: metrics: - receivers: [otlp, prometheus] + receivers: [prometheus] processors: [batch] - exporters: [clickhousemetricswrite] \ No newline at end of file + exporters: [clickhousemetricswrite] diff --git a/deploy/docker/clickhouse-setup/docker-compose.yaml b/deploy/docker/clickhouse-setup/docker-compose.yaml index f8c2954446..5a47b6a461 100644 --- a/deploy/docker/clickhouse-setup/docker-compose.yaml +++ b/deploy/docker/clickhouse-setup/docker-compose.yaml @@ -82,16 +82,20 @@ services: command: ["--config=/etc/otel-collector-config.yaml"] volumes: - ./otel-collector-config.yaml:/etc/otel-collector-config.yaml + environment: + - OTEL_RESOURCE_ATTRIBUTES=host.name=signoz-host,os.type=linux ports: + # - "1777:1777" # pprof extension - "4317:4317" # OTLP gRPC receiver - "4318:4318" # OTLP HTTP receiver - # - "8889:8889" # Prometheus metrics exposed by the agent - # - "13133:13133" # health_check - # - "14268:14268" # Jaeger receiver + # - "8888:8888" # OtelCollector internal metrics + # - "8889:8889" # signoz spanmetrics exposed by the agent + # - "9411:9411" # Zipkin port + # - "13133:13133" # health 
check extension + # - "14250:14250" # Jaeger gRPC + # - "14268:14268" # Jaeger thrift HTTP # - "55678:55678" # OpenCensus receiver - # - "55679:55679" # zpages extension - # - "55680:55680" # OTLP gRPC legacy receiver - # - "55681:55681" # OTLP HTTP legacy receiver + # - "55679:55679" # zPages extension mem_limit: 2000m restart: on-failure depends_on: @@ -103,6 +107,11 @@ services: command: ["--config=/etc/otel-collector-metrics-config.yaml"] volumes: - ./otel-collector-metrics-config.yaml:/etc/otel-collector-metrics-config.yaml + # ports: + # - "1777:1777" # pprof extension + # - "8888:8888" # OtelCollector internal metrics + # - "13133:13133" # Health check extension + # - "55679:55679" # zPages extension restart: on-failure depends_on: clickhouse: diff --git a/deploy/docker/clickhouse-setup/otel-collector-config.yaml b/deploy/docker/clickhouse-setup/otel-collector-config.yaml index e363f015df..0717cf4c45 100644 --- a/deploy/docker/clickhouse-setup/otel-collector-config.yaml +++ b/deploy/docker/clickhouse-setup/otel-collector-config.yaml @@ -1,25 +1,36 @@ receivers: + opencensus: + endpoint: 0.0.0.0:55678 otlp/spanmetrics: protocols: grpc: - endpoint: "localhost:12345" + endpoint: localhost:12345 otlp: protocols: grpc: + endpoint: 0.0.0.0:4317 http: + endpoint: 0.0.0.0:4318 jaeger: protocols: grpc: + endpoint: 0.0.0.0:14250 thrift_http: + endpoint: 0.0.0.0:14268 + # thrift_compact: + # endpoint: 0.0.0.0:6831 + # thrift_binary: + # endpoint: 0.0.0.0:6832 hostmetrics: collection_interval: 60s scrapers: - cpu: - load: - memory: - disk: - filesystem: - network: + cpu: {} + load: {} + memory: {} + disk: {} + filesystem: {} + network: {} + processors: batch: send_batch_size: 10000 @@ -49,9 +60,20 @@ processors: # num_workers: 4 # queue_size: 100 # retry_on_failure: true + resourcedetection: + # Using OTEL_RESOURCE_ATTRIBUTES envvar, env detector adds custom labels. + detectors: [env, system] # include ec2 for AWS, gce for GCP and azure for Azure. 
+ timeout: 2s + override: false + extensions: - health_check: {} - zpages: {} + health_check: + endpoint: 0.0.0.0:13133 + zpages: + endpoint: 0.0.0.0:55679 + pprof: + endpoint: 0.0.0.0:1777 + exporters: clickhousetraces: datasource: tcp://clickhouse:9000/?database=signoz_traces @@ -60,18 +82,30 @@ exporters: resource_to_telemetry_conversion: enabled: true prometheus: - endpoint: "0.0.0.0:8889" + endpoint: 0.0.0.0:8889 + # logging: {} + service: - extensions: [health_check, zpages] + telemetry: + metrics: + address: 0.0.0.0:8888 + extensions: + - health_check + - zpages + - pprof pipelines: traces: receivers: [jaeger, otlp] processors: [signozspanmetrics/prometheus, batch] exporters: [clickhousetraces] metrics: - receivers: [otlp, hostmetrics] + receivers: [otlp] processors: [batch] exporters: [clickhousemetricswrite] + metrics/hostmetrics: + receivers: [hostmetrics] + processors: [resourcedetection, batch] + exporters: [clickhousemetricswrite] metrics/spanmetrics: receivers: [otlp/spanmetrics] exporters: [prometheus] diff --git a/deploy/docker/clickhouse-setup/otel-collector-metrics-config.yaml b/deploy/docker/clickhouse-setup/otel-collector-metrics-config.yaml index 26c629ba60..fdc5830f57 100644 --- a/deploy/docker/clickhouse-setup/otel-collector-metrics-config.yaml +++ b/deploy/docker/clickhouse-setup/otel-collector-metrics-config.yaml @@ -3,15 +3,28 @@ receivers: protocols: grpc: http: - - # Data sources: metrics prometheus: config: scrape_configs: + # otel-collector internal metrics - job_name: "otel-collector" scrape_interval: 60s static_configs: - - targets: ["otel-collector:8889"] + - targets: + - otel-collector:8888 + # otel-collector-metrics internal metrics + - job_name: "otel-collector-metrics" + scrape_interval: 60s + static_configs: + - targets: + - localhost:8888 + # SigNoz span metrics + - job_name: "signozspanmetrics-collector" + scrape_interval: 60s + static_configs: + - targets: + - otel-collector:8889 + processors: batch: send_batch_size: 10000 
@@ -32,17 +45,29 @@ processors: # num_workers: 4 # queue_size: 100 # retry_on_failure: true + extensions: - health_check: {} - zpages: {} + health_check: + endpoint: 0.0.0.0:13133 + zpages: + endpoint: 0.0.0.0:55679 + pprof: + endpoint: 0.0.0.0:1777 + exporters: clickhousemetricswrite: endpoint: tcp://clickhouse:9000/?database=signoz_metrics service: - extensions: [health_check, zpages] + telemetry: + metrics: + address: 0.0.0.0:8888 + extensions: + - health_check + - zpages + - pprof pipelines: metrics: - receivers: [otlp, prometheus] + receivers: [prometheus] processors: [batch] exporters: [clickhousemetricswrite] diff --git a/deploy/docker/clickhouse-setup/users.xml b/deploy/docker/clickhouse-setup/users.xml deleted file mode 100644 index f18562071d..0000000000 --- a/deploy/docker/clickhouse-setup/users.xml +++ /dev/null @@ -1,123 +0,0 @@ - - - - - - - - - - 10000000000 - - - random - - - - - 1 - - - - - - - - - - - - - ::/0 - - - - default - - - default - - - - - - - - - - - - - - 3600 - - - 0 - 0 - 0 - 0 - 0 - - - - diff --git a/pkg/query-service/tests/test-deploy/docker-compose.yaml b/pkg/query-service/tests/test-deploy/docker-compose.yaml index 6191c18fa0..9ef7cb1bfc 100644 --- a/pkg/query-service/tests/test-deploy/docker-compose.yaml +++ b/pkg/query-service/tests/test-deploy/docker-compose.yaml @@ -63,6 +63,8 @@ services: command: ["--config=/etc/otel-collector-config.yaml"] volumes: - ./otel-collector-config.yaml:/etc/otel-collector-config.yaml + environment: + - OTEL_RESOURCE_ATTRIBUTES=host.name=signoz-host,os.type=linux ports: - "4317:4317" # OTLP GRPC receiver mem_limit: 2000m diff --git a/pkg/query-service/tests/test-deploy/otel-collector-config.yaml b/pkg/query-service/tests/test-deploy/otel-collector-config.yaml index d6c12ddcc1..b343350a34 100644 --- a/pkg/query-service/tests/test-deploy/otel-collector-config.yaml +++ b/pkg/query-service/tests/test-deploy/otel-collector-config.yaml @@ -1,28 +1,40 @@ receivers: + opencensus: + endpoint: 
0.0.0.0:55678 otlp/spanmetrics: protocols: grpc: - endpoint: "localhost:12345" + endpoint: localhost:12345 otlp: protocols: grpc: + endpoint: 0.0.0.0:4317 http: + endpoint: 0.0.0.0:4318 jaeger: protocols: grpc: + endpoint: 0.0.0.0:14250 thrift_http: + endpoint: 0.0.0.0:14268 + # thrift_compact: + # endpoint: 0.0.0.0:6831 + # thrift_binary: + # endpoint: 0.0.0.0:6832 hostmetrics: - collection_interval: 30s + collection_interval: 60s scrapers: - cpu: - load: - memory: - disk: - filesystem: - network: + cpu: {} + load: {} + memory: {} + disk: {} + filesystem: {} + network: {} + processors: batch: - send_batch_size: 1000 + send_batch_size: 10000 + send_batch_max_size: 11000 timeout: 10s signozspanmetrics/prometheus: metrics_exporter: prometheus @@ -34,20 +46,33 @@ processors: - name: deployment.environment default: default # memory_limiter: - # # Same as --mem-ballast-size-mib CLI argument - # ballast_size_mib: 683 # # 80% of maximum memory up to 2G # limit_mib: 1500 # # 25% of limit up to 2G # spike_limit_mib: 512 # check_interval: 5s + # + # # 50% of the maximum memory + # limit_percentage: 50 + # # 20% of max memory usage spike expected + # spike_limit_percentage: 20 # queued_retry: # num_workers: 4 # queue_size: 100 # retry_on_failure: true + resourcedetection: + detectors: [env, system] + timeout: 2s + override: false + extensions: - health_check: {} - zpages: {} + health_check: + endpoint: 0.0.0.0:13133 + zpages: + endpoint: 0.0.0.0:55679 + pprof: + endpoint: 0.0.0.0:1777 + exporters: clickhousetraces: datasource: tcp://clickhouse:9000/?database=signoz_traces @@ -56,18 +81,30 @@ exporters: resource_to_telemetry_conversion: enabled: true prometheus: - endpoint: "0.0.0.0:8889" + endpoint: 0.0.0.0:8889 + # logging: {} + service: - extensions: [health_check, zpages] + telemetry: + metrics: + address: 0.0.0.0:8888 + extensions: + - health_check + - zpages + - pprof pipelines: traces: receivers: [jaeger, otlp] processors: [signozspanmetrics/prometheus, batch] 
exporters: [clickhousetraces] metrics: - receivers: [otlp, hostmetrics] + receivers: [otlp] processors: [batch] exporters: [clickhousemetricswrite] + metrics/hostmetrics: + receivers: [hostmetrics] + processors: [resourcedetection, batch] + exporters: [clickhousemetricswrite] metrics/spanmetrics: receivers: [otlp/spanmetrics] - exporters: [prometheus] \ No newline at end of file + exporters: [prometheus] diff --git a/pkg/query-service/tests/test-deploy/otel-collector-metrics-config.yaml b/pkg/query-service/tests/test-deploy/otel-collector-metrics-config.yaml index 3af039268c..fdc5830f57 100644 --- a/pkg/query-service/tests/test-deploy/otel-collector-metrics-config.yaml +++ b/pkg/query-service/tests/test-deploy/otel-collector-metrics-config.yaml @@ -3,42 +3,71 @@ receivers: protocols: grpc: http: - - # Data sources: metrics prometheus: config: scrape_configs: + # otel-collector internal metrics - job_name: "otel-collector" - scrape_interval: 30s + scrape_interval: 60s static_configs: - - targets: ["otel-collector:8889"] + - targets: + - otel-collector:8888 + # otel-collector-metrics internal metrics + - job_name: "otel-collector-metrics" + scrape_interval: 60s + static_configs: + - targets: + - localhost:8888 + # SigNoz span metrics + - job_name: "signozspanmetrics-collector" + scrape_interval: 60s + static_configs: + - targets: + - otel-collector:8889 + processors: batch: - send_batch_size: 1000 + send_batch_size: 10000 + send_batch_max_size: 11000 timeout: 10s # memory_limiter: - # # Same as --mem-ballast-size-mib CLI argument - # ballast_size_mib: 683 # # 80% of maximum memory up to 2G # limit_mib: 1500 # # 25% of limit up to 2G # spike_limit_mib: 512 # check_interval: 5s + # + # # 50% of the maximum memory + # limit_percentage: 50 + # # 20% of max memory usage spike expected + # spike_limit_percentage: 20 # queued_retry: # num_workers: 4 # queue_size: 100 # retry_on_failure: true + extensions: - health_check: {} - zpages: {} + health_check: + endpoint: 
0.0.0.0:13133 + zpages: + endpoint: 0.0.0.0:55679 + pprof: + endpoint: 0.0.0.0:1777 + exporters: clickhousemetricswrite: endpoint: tcp://clickhouse:9000/?database=signoz_metrics service: - extensions: [health_check, zpages] + telemetry: + metrics: + address: 0.0.0.0:8888 + extensions: + - health_check + - zpages + - pprof pipelines: metrics: - receivers: [otlp, prometheus] + receivers: [prometheus] processors: [batch] - exporters: [clickhousemetricswrite] \ No newline at end of file + exporters: [clickhousemetricswrite] From bebfaa1c4c0f479af1261579848325f0c715914a Mon Sep 17 00:00:00 2001 From: Priyansh Khodiyar Date: Thu, 14 Jul 2022 22:41:11 +0530 Subject: [PATCH 30/43] Update CONTRIBUTING.md --- CONTRIBUTING.md | 91 +++++++++++++++++++++++++++++++++++++------------ 1 file changed, 69 insertions(+), 22 deletions(-) diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index 86d4010def..4bb6509924 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -6,6 +6,12 @@ Hi there! We're thrilled that you'd like to contribute to this project, thank yo Please read through this document before submitting any issues or pull requests to ensure we have all the necessary information to effectively respond to your bug report or contribution. +- We accept contributions made to the [SigNoz `develop` branch]() +- Find all SigNoz Docker Hub images here + - [signoz/frontend](https://hub.docker.com/r/signoz/frontend) + - [signoz/query-service](https://hub.docker.com/r/signoz/query-service) + - [signoz/otelcontribcol](https://hub.docker.com/r/signoz/otelcontribcol) + ## Finding contributions to work on πŸ’¬ Looking at the existing issues is a great way to find something to contribute on. 
@@ -24,6 +30,7 @@ Also, have a look at these [good first issues labels](https://github.com/SigNoz/ - [To run ClickHouse setup](#41-to-run-clickhouse-setup-recommended-for-local-development) - [Contribute to SigNoz Helm Chart](#5-contribute-to-signoz-helm-chart-) - [To run helm chart for local development](#51-to-run-helm-chart-for-local-development) +- [Other Ways to Contribute](#other-ways-to-contribute) # 1. General Instructions πŸ“ @@ -135,26 +142,43 @@ Depending upon your area of expertise & interest, you can choose one or more to **Need to Update: [https://github.com/SigNoz/signoz/tree/develop/frontend](https://github.com/SigNoz/signoz/tree/develop/frontend)** -### 3.1 Contribute to Frontend with Docker installation of SigNoz +Also, have a look at [Frontend README.md](https://github.com/SigNoz/signoz/blob/develop/frontend/README.md) sections for more info on how to setup SigNoz frontend locally (with and without Docker). + +## 3.1 Contribute to Frontend with Docker installation of SigNoz - Clone the SigNoz repository and cd into signoz directory, ``` git clone https://github.com/SigNoz/signoz.git && cd signoz ``` - Comment out `frontend` service section at [`deploy/docker/clickhouse-setup/docker-compose.yaml#L68`](https://github.com/SigNoz/signoz/blob/develop/deploy/docker/clickhouse-setup/docker-compose.yaml#L68) + +![develop-frontend](https://user-images.githubusercontent.com/52788043/179009217-6692616b-17dc-4d27-b587-9d007098d739.jpeg) + + - run `cd deploy` to move to deploy directory, - Install signoz locally **without** the frontend, - Add / Uncomment the below configuration to query-service section at [`deploy/docker/clickhouse-setup/docker-compose.yaml#L47`](https://github.com/SigNoz/signoz/blob/develop/deploy/docker/clickhouse-setup/docker-compose.yaml#L47) - ```docker + ``` ports: - "8080:8080" ``` +query service - Next run, ``` sudo docker-compose -f docker/clickhouse-setup/docker-compose.yaml up -d ``` -- `cd ../frontend` and change baseURL to 
`http://localhost:8080` in file [`src/constants/env.ts`](https://github.com/SigNoz/signoz/blob/develop/frontend/src/constants/env.ts) +- `cd ../frontend` and change baseURL in file [`frontend/src/constants/env.ts#L2`](https://github.com/SigNoz/signoz/blob/develop/frontend/src/constants/env.ts#L2) and for that, you need to create a `.env` file in the `frontend` directory with the following environment variable (`FRONTEND_API_ENDPOINT`) matching your configuration. + + If you have backend api exposed via frontend nginx: + ``` + FRONTEND_API_ENDPOINT=http://localhost:3301 + ``` + If not: + ``` + FRONTEND_API_ENDPOINT=http://localhost:8080 + ``` + - Next, ``` yarn install @@ -166,7 +190,7 @@ The Maintainers / Contributors who will change Line Numbers of `Frontend` & `Que **[`^top^`](#)** -### 3.2 Contribute to Frontend without installing SigNoz backend +## 3.2 Contribute to Frontend without installing SigNoz backend If you don't want to install the SigNoz backend just for doing frontend development, we can provide you with test environments that you can use as the backend. 
@@ -193,27 +217,27 @@ Please ping us in the [`#contributing`](https://signoz-community.slack.com/archi **Need to Update:** [**https://github.com/SigNoz/signoz/tree/develop/pkg/query-service**](https://github.com/SigNoz/signoz/tree/develop/pkg/query-service) -### 4.1 To run ClickHouse setup (recommended for local development) +## 4.1 To run ClickHouse setup (recommended for local development) - Clone the SigNoz repository and cd into signoz directory, -``` -git clone https://github.com/SigNoz/signoz.git && cd signoz -``` + ``` + git clone https://github.com/SigNoz/signoz.git && cd signoz + ``` - run `sudo make dev-setup` to configure local setup to run query-service, - Comment out `frontend` service section at [`deploy/docker/clickhouse-setup/docker-compose.yaml#L68`](https://github.com/SigNoz/signoz/blob/develop/deploy/docker/clickhouse-setup/docker-compose.yaml#L68) - Comment out `query-service` section at [`deploy/docker/clickhouse-setup/docker-compose.yaml#L41`,](https://github.com/SigNoz/signoz/blob/develop/deploy/docker/clickhouse-setup/docker-compose.yaml#L41) - add below configuration to `clickhouse` section at [`deploy/docker/clickhouse-setup/docker-compose.yaml`,](https://github.com/SigNoz/signoz/blob/develop/deploy/docker/clickhouse-setup/docker-compose.yaml) ``` - expose: - - 9000 ports: - 9001:9000 ``` - run `cd pkg/query-service/` to move to `query-service` directory, -- Open [`./constants/constants.go#L38`,](https://github.com/SigNoz/signoz/blob/develop/pkg/query-service/constants/constants.go#L38) - - Replace ```const RELATIONAL_DATASOURCE_PATH = "/var/lib/signoz/signoz.db"``` \ - with β†’ ```const RELATIONAL_DATASOURCE_PATH = "./signoz.db".``` +- Then, you need to create a `.env` file with the following environment variable + ``` + SIGNOZ_LOCAL_DB_PATH="./signoz.db" + ``` +to set your local environment with the right `RELATIONAL_DATASOURCE_PATH` as mentioned in 
[`./constants/constants.go#L38`,](https://github.com/SigNoz/signoz/blob/develop/pkg/query-service/constants/constants.go#L38) - Now, install SigNoz locally **without** the `frontend` and `query-service`, - If you are using `x86_64` processors (All Intel/AMD processors) run `sudo make run-x86` @@ -223,6 +247,29 @@ git clone https://github.com/SigNoz/signoz.git && cd signoz ``` ClickHouseUrl=tcp://localhost:9001 STORAGE=clickhouse go run main.go ``` + +#### Build and Run locally +``` +cd pkg/query-service +go build -o build/query-service main.go +ClickHouseUrl=tcp://localhost:9001 STORAGE=clickhouse build/query-service +``` + +#### Docker Images +The docker images of query-service are available at https://hub.docker.com/r/signoz/query-service + +``` +docker pull signoz/query-service +``` + +``` +docker pull signoz/query-service:latest +``` + +``` +docker pull signoz/query-service:develop +``` + ### Important Note: The Maintainers / Contributors who will change Line Numbers of `Frontend` & `Query-Section`, please update line numbers in [`/.scripts/commentLinesForSetup.sh`](https://github.com/SigNoz/signoz/blob/develop/.scripts/commentLinesForSetup.sh) @@ -248,12 +295,12 @@ Click the button below. 
A workspace with all required environments will be creat **Need to Update: [https://github.com/SigNoz/charts](https://github.com/SigNoz/charts).** -### 5.1 To run helm chart for local development +## 5.1 To run helm chart for local development - Clone the SigNoz repository and cd into charts directory, -``` -git clone https://github.com/SigNoz/charts.git && cd charts -``` + ``` + git clone https://github.com/SigNoz/charts.git && cd charts + ``` - It is recommended to use lightweight kubernetes (k8s) cluster for local development: - [kind](https://kind.sigs.k8s.io/docs/user/quick-start/#installation) - [k3d](https://k3d.io/#installation) @@ -261,9 +308,9 @@ git clone https://github.com/SigNoz/charts.git && cd charts - create a k8s cluster and make sure `kubectl` points to the locally created k8s cluster, - run `make dev-install` to install SigNoz chart with `my-release` release name in `platform` namespace, - next run, -``` -kubectl -n platform port-forward svc/my-release-signoz-frontend 3301:3301 -``` + ``` + kubectl -n platform port-forward svc/my-release-signoz-frontend 3301:3301 + ``` to make SigNoz UI available at [localhost:3301](http://localhost:3301) **5.1.1 To install the HotROD sample app:** @@ -300,7 +347,7 @@ curl -sL https://github.com/SigNoz/signoz/raw/main/sample-apps/hotrod/hotrod-del --- -## Other ways to contribute +## Other Ways to Contribute There are many other ways to get involved with the community and to participate in this project: @@ -315,6 +362,6 @@ There are many other ways to get involved with the community and to participate By contributing to SigNoz, you agree that your contributions will be licensed under its MIT license. 
-Again, feel free to ping us on `#contributing` or `#contributing-frontend` on our slack community if you need any help on this :) +Again, feel free to ping us on [`#contributing`](https://signoz-community.slack.com/archives/C01LWQ8KS7M) or [`#contributing-frontend`](https://signoz-community.slack.com/archives/C027134DM8B) on our slack community if you need any help on this :) Thank You! From 4ad79bee1855909d12c4817007d1052614e13a9b Mon Sep 17 00:00:00 2001 From: Priyansh Khodiyar Date: Thu, 14 Jul 2022 22:51:51 +0530 Subject: [PATCH 31/43] add images --- CONTRIBUTING.md | 5 +++++ 1 file changed, 5 insertions(+) diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index 4bb6509924..cff1047baf 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -225,12 +225,17 @@ ``` - run `sudo make dev-setup` to configure local setup to run query-service, - Comment out `frontend` service section at [`deploy/docker/clickhouse-setup/docker-compose.yaml#L68`](https://github.com/SigNoz/signoz/blob/develop/deploy/docker/clickhouse-setup/docker-compose.yaml#L68) +develop-frontend + - Comment out `query-service` section at [`deploy/docker/clickhouse-setup/docker-compose.yaml#L41`,](https://github.com/SigNoz/signoz/blob/develop/deploy/docker/clickhouse-setup/docker-compose.yaml#L41) +Screenshot 2022-07-14 at 22 48 07 + - add below configuration to `clickhouse` section at [`deploy/docker/clickhouse-setup/docker-compose.yaml`,](https://github.com/SigNoz/signoz/blob/develop/deploy/docker/clickhouse-setup/docker-compose.yaml) ``` ports: - 9001:9000 ``` +Screenshot 2022-07-14 at 22 50 37 - run `cd pkg/query-service/` to move to `query-service` directory, - Then, you need to create a `.env` file with the following environment variable From c5c7fb238f7016214bc99bcc464f7f8a91625ffe Mon Sep 17 00:00:00 2001 From: Srikanth Chekuri Date: Fri, 15 Jul 2022 09:55:43 +0530 Subject: [PATCH 32/43] fix: update the error rate 
percentage text and scale (#1399) --- frontend/src/container/MetricsTable/index.tsx | 2 +- pkg/query-service/app/clickhouseReader/reader.go | 4 ++-- 2 files changed, 3 insertions(+), 3 deletions(-) diff --git a/frontend/src/container/MetricsTable/index.tsx b/frontend/src/container/MetricsTable/index.tsx index ff700da83a..cc0778c80e 100644 --- a/frontend/src/container/MetricsTable/index.tsx +++ b/frontend/src/container/MetricsTable/index.tsx @@ -56,7 +56,7 @@ function Metrics(): JSX.Element { render: (value: number): string => (value / 1000000).toFixed(2), }, { - title: 'Error Rate (in %)', + title: 'Error Rate (% of requests)', dataIndex: 'errorRate', key: 'errorRate', sorter: (a: DataProps, b: DataProps): number => a.errorRate - b.errorRate, diff --git a/pkg/query-service/app/clickhouseReader/reader.go b/pkg/query-service/app/clickhouseReader/reader.go index d22d26fc0e..6716373c8b 100644 --- a/pkg/query-service/app/clickhouseReader/reader.go +++ b/pkg/query-service/app/clickhouseReader/reader.go @@ -734,8 +734,8 @@ func (r *ClickHouseReader) GetServices(ctx context.Context, queryParams *model.G serviceItems[i].Num4XX = val } serviceItems[i].CallRate = float64(serviceItems[i].NumCalls) / float64(queryParams.Period) - serviceItems[i].FourXXRate = float64(serviceItems[i].Num4XX) / float64(serviceItems[i].NumCalls) - serviceItems[i].ErrorRate = float64(serviceItems[i].NumErrors) / float64(serviceItems[i].NumCalls) + serviceItems[i].FourXXRate = float64(serviceItems[i].Num4XX) * 100 / float64(serviceItems[i].NumCalls) + serviceItems[i].ErrorRate = float64(serviceItems[i].NumErrors) * 100 / float64(serviceItems[i].NumCalls) } return &serviceItems, nil From c90e9ffa3401142e8311bf5036443f59f6ba809e Mon Sep 17 00:00:00 2001 From: Vishal Sharma Date: Fri, 15 Jul 2022 12:35:15 +0530 Subject: [PATCH 33/43] fix: remove requirement of exceptionType and serviceName from errorDetail page URL (#1400) * fix: remove requirement of exceptionType and serviceName from errorDetail page 
URL * chore: id is updated * chore: commented code is removed * chore: eslint error is fixed Co-authored-by: Palash --- frontend/src/container/AllError/index.tsx | 4 +- frontend/src/container/ErrorDetails/index.tsx | 8 ++-- frontend/src/pages/ErrorDetails/index.tsx | 37 +++++-------------- 3 files changed, 13 insertions(+), 36 deletions(-) diff --git a/frontend/src/container/AllError/index.tsx b/frontend/src/container/AllError/index.tsx index eef4fca88b..253af7dfe1 100644 --- a/frontend/src/container/AllError/index.tsx +++ b/frontend/src/container/AllError/index.tsx @@ -101,9 +101,7 @@ function AllErrors(): JSX.Element { render: (value, record): JSX.Element => ( value}> diff --git a/frontend/src/container/ErrorDetails/index.tsx b/frontend/src/container/ErrorDetails/index.tsx index ea8a3c2e3e..a200744890 100644 --- a/frontend/src/container/ErrorDetails/index.tsx +++ b/frontend/src/container/ErrorDetails/index.tsx @@ -90,11 +90,9 @@ function ErrorDetails(props: ErrorDetailsProps): JSX.Element { } history.replace( - `${history.location.pathname}?${urlKey.serviceName}=${serviceName}&${ - urlKey.exceptionType - }=${errorType}&groupId=${idPayload.groupID}×tamp=${getNanoSeconds( - timespamp, - )}&errorId=${id}`, + `${history.location.pathname}?&groupId=${ + idPayload.groupID + }×tamp=${getNanoSeconds(timespamp)}&errorId=${id}`, ); } catch (error) { notification.error({ diff --git a/frontend/src/pages/ErrorDetails/index.tsx b/frontend/src/pages/ErrorDetails/index.tsx index 25bffe874d..348391b741 100644 --- a/frontend/src/pages/ErrorDetails/index.tsx +++ b/frontend/src/pages/ErrorDetails/index.tsx @@ -23,8 +23,6 @@ function ErrorDetails(): JSX.Element { const { search } = useLocation(); const params = useMemo(() => new URLSearchParams(search), [search]); - const serviceName = params.get(urlKey.serviceName); - const expectionType = params.get(urlKey.exceptionType); const groupId = params.get(urlKey.groupId); const errorId = params.get(urlKey.errorId); const timestamp = 
params.get(urlKey.timestamp); @@ -50,34 +48,17 @@ function ErrorDetails(): JSX.Element { }, ); - const { data, status } = useQuery( - [ - 'expectionType', - expectionType, - 'serviceName', - serviceName, - maxTime, - minTime, - groupId, - ], - { - queryFn: () => - getByErrorType({ - groupID: groupId || '', - timestamp: timestamp || '', - }), - enabled: - !!expectionType && !!serviceName && !!groupId && IdStatus !== 'success', - }, - ); + const { data, status } = useQuery([maxTime, minTime, groupId], { + queryFn: () => + getByErrorType({ + groupID: groupId || '', + timestamp: timestamp || '', + }), + enabled: !!groupId && IdStatus !== 'success', + }); // if errorType and serviceName is null redirecting to the ALL_ERROR page not now - if ( - serviceName === null || - expectionType === null || - groupId === null || - timestamp === null - ) { + if (groupId === null || timestamp === null) { return ; } From b6a6833a642d209002bde11099243cd2b29b4861 Mon Sep 17 00:00:00 2001 From: Palash Date: Fri, 15 Jul 2022 12:46:57 +0530 Subject: [PATCH 34/43] test: utils unit case is updated (#1396) --- frontend/src/container/AllError/utils.test.ts | 83 ++++++++++++++++++- 1 file changed, 82 insertions(+), 1 deletion(-) diff --git a/frontend/src/container/AllError/utils.test.ts b/frontend/src/container/AllError/utils.test.ts index b0d302f01b..344d318ebf 100644 --- a/frontend/src/container/AllError/utils.test.ts +++ b/frontend/src/container/AllError/utils.test.ts @@ -1,4 +1,15 @@ -import { isOrder, isOrderParams } from './utils'; +import { Order, OrderBy } from 'types/api/errors/getAll'; + +import { + getDefaultOrder, + getLimit, + getOffSet, + getOrder, + getOrderParams, + getUpdatePageSize, + isOrder, + isOrderParams, +} from './utils'; describe('Error utils', () => { test('Valid OrderBy Params', () => { @@ -25,4 +36,74 @@ describe('Error utils', () => { expect(isOrder(null)).toBe(false); expect(isOrder('')).toBe(false); }); + + test('Default Order', () => { + const OrderBy: OrderBy[] 
= [ + 'exceptionCount', + 'exceptionType', + 'firstSeen', + 'lastSeen', + 'serviceName', + ]; + + const order: Order[] = ['ascending', 'descending']; + + const ascOrd = order[0]; + const desOrd = order[1]; + + OrderBy.forEach((order) => { + expect(getDefaultOrder(order, ascOrd, order)).toBe('ascend'); + expect(getDefaultOrder(order, desOrd, order)).toBe('descend'); + }); + }); + + test('Limit', () => { + expect(getLimit(null)).toBe(10); + expect(getLimit('')).toBe(10); + expect(getLimit('0')).toBe(0); + expect(getLimit('1')).toBe(1); + expect(getLimit('10')).toBe(10); + expect(getLimit('11')).toBe(11); + expect(getLimit('100')).toBe(100); + expect(getLimit('101')).toBe(101); + }); + + test('Update Page Size', () => { + expect(getUpdatePageSize(null)).toBe(10); + expect(getUpdatePageSize('')).toBe(10); + expect(getUpdatePageSize('0')).toBe(0); + expect(getUpdatePageSize('1')).toBe(1); + expect(getUpdatePageSize('10')).toBe(10); + expect(getUpdatePageSize('11')).toBe(11); + expect(getUpdatePageSize('100')).toBe(100); + expect(getUpdatePageSize('101')).toBe(101); + }); + + test('Order Params', () => { + expect(getOrderParams(null)).toBe('serviceName'); + expect(getOrderParams('')).toBe('serviceName'); + expect(getOrderParams('serviceName')).toBe('serviceName'); + expect(getOrderParams('exceptionCount')).toBe('exceptionCount'); + expect(getOrderParams('lastSeen')).toBe('lastSeen'); + expect(getOrderParams('firstSeen')).toBe('firstSeen'); + expect(getOrderParams('exceptionType')).toBe('exceptionType'); + }); + + test('OffSet', () => { + expect(getOffSet(null)).toBe(0); + expect(getOffSet('')).toBe(0); + expect(getOffSet('0')).toBe(0); + expect(getOffSet('1')).toBe(1); + expect(getOffSet('10')).toBe(10); + expect(getOffSet('11')).toBe(11); + expect(getOffSet('100')).toBe(100); + expect(getOffSet('101')).toBe(101); + }); + + test('Order', () => { + expect(getOrder(null)).toBe('ascending'); + expect(getOrder('')).toBe('ascending'); + 
expect(getOrder('ascending')).toBe('ascending'); + expect(getOrder('descending')).toBe('descending'); + }); }); From e22be60a9ea4463f390e1af0931268ec3451b4a9 Mon Sep 17 00:00:00 2001 From: Palash Date: Fri, 15 Jul 2022 13:01:29 +0530 Subject: [PATCH 35/43] Create dependency-review.yml (#1360) * Create dependency-review.yml --- .github/workflows/dependency-review.yml | 22 ++++++++++++++++++++++ 1 file changed, 22 insertions(+) create mode 100644 .github/workflows/dependency-review.yml diff --git a/.github/workflows/dependency-review.yml b/.github/workflows/dependency-review.yml new file mode 100644 index 0000000000..053a8733dc --- /dev/null +++ b/.github/workflows/dependency-review.yml @@ -0,0 +1,22 @@ +# Dependency Review Action +# +# This Action will scan dependency manifest files that change as part of a Pull Request, surfacing known-vulnerable versions of the packages declared or updated in the PR. Once installed, if the workflow run is marked as required, PRs introducing known-vulnerable packages will be blocked from merging. 
+# +# Source repository: https://github.com/actions/dependency-review-action +# Public documentation: https://docs.github.com/en/code-security/supply-chain-security/understanding-your-software-supply-chain/about-dependency-review#dependency-review-enforcement +name: 'Dependency Review' +on: [pull_request] + +permissions: + contents: read + +jobs: + dependency-review: + runs-on: ubuntu-latest + steps: + - name: 'Checkout Repository' + uses: actions/checkout@v3 + - name: 'Dependency Review' + with: + fail-on-severity: high + uses: actions/dependency-review-action@v2 From 964b819f20df44a15972266f3b3be87525a91e73 Mon Sep 17 00:00:00 2001 From: Ankit Anand <83692067+ankit01-oss@users.noreply.github.com> Date: Fri, 15 Jul 2022 13:38:39 +0530 Subject: [PATCH 36/43] Update CONTRIBUTING.md (#1) --- CONTRIBUTING.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index cff1047baf..80e07b7522 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -15,7 +15,7 @@ Please read through this document before submitting any issues or pull requests ## Finding contributions to work on πŸ’¬ Looking at the existing issues is a great way to find something to contribute on. -Also, have a look at these [good first issues labels](https://github.com/SigNoz/signoz/issues?q=is%3Aissue+is%3Aopen+label%3A%22good+first+issue%22) to start with. +Also, have a look at these [good first issues label](https://github.com/SigNoz/signoz/issues?q=is%3Aissue+is%3Aopen+label%3A%22good+first+issue%22) to start with. 
 ## Sections:

From b9d63d6b8fae3fb43871c016ccc36b48769d41ff Mon Sep 17 00:00:00 2001
From: Palash
Date: Fri, 15 Jul 2022 14:17:29 +0530
Subject: [PATCH 37/43] feat: text is now ellipsed (#1392)

* feat: text is now ellipsed
---
 .../SelectedSpanDetails/EllipsedButton.tsx    | 53 ++++++++++++++
 .../SelectedSpanDetails/ErrorTag.tsx          | 72 +++++++------------
 .../TraceDetail/SelectedSpanDetails/index.tsx | 72 +++++++++++++++++--
 .../TraceDetail/SelectedSpanDetails/styles.ts | 18 ++++-
 4 files changed, 158 insertions(+), 57 deletions(-)
 create mode 100644 frontend/src/container/TraceDetail/SelectedSpanDetails/EllipsedButton.tsx

diff --git a/frontend/src/container/TraceDetail/SelectedSpanDetails/EllipsedButton.tsx b/frontend/src/container/TraceDetail/SelectedSpanDetails/EllipsedButton.tsx
new file mode 100644
index 0000000000..56ef64e4ee
--- /dev/null
+++ b/frontend/src/container/TraceDetail/SelectedSpanDetails/EllipsedButton.tsx
@@ -0,0 +1,53 @@
+import { StyledButton } from 'components/Styled';
+import React from 'react';
+
+import { styles } from './styles';
+
+function EllipsedButton({
+  onToggleHandler,
+  setText,
+  value,
+  event,
+  buttonText,
+}: Props): JSX.Element {
+  const isFullValueButton = buttonText === 'View full value';
+
+  const style = [styles.removePadding];
+
+  if (!isFullValueButton) {
+    style.push(styles.removeMargin);
+  } else {
+    style.push(styles.selectedSpanDetailsContainer);
+    style.push(styles.buttonContainer);
+  }
+
+  return (
+    <StyledButton
+      styledclass={style}
+      onClick={(): void => {
+        onToggleHandler(true);
+        setText({
+          subText: value,
+          text: event,
+        });
+      }}
+      type="link"
+    >
+      {buttonText}
+    </StyledButton>
+  );
+}
+
+interface Props {
+  onToggleHandler: (isOpen: boolean) => void;
+  setText: (text: { subText: string; text: string }) => void;
+  value: string;
+  event: string;
+  buttonText?: string;
+}
+
+EllipsedButton.defaultProps = {
+  buttonText: 'View full log event message',
+};
+
+export default EllipsedButton;

diff --git a/frontend/src/container/TraceDetail/SelectedSpanDetails/ErrorTag.tsx b/frontend/src/container/TraceDetail/SelectedSpanDetails/ErrorTag.tsx
index 2a663387a5..69b51b3cd8 100644
--- a/frontend/src/container/TraceDetail/SelectedSpanDetails/ErrorTag.tsx
+++ b/frontend/src/container/TraceDetail/SelectedSpanDetails/ErrorTag.tsx
@@ -1,29 +1,22 @@
-import { Collapse, Modal } from 'antd';
-import Editor from 'components/Editor';
-import { StyledButton } from 'components/Styled';
+import { Collapse } from 'antd';
 import useThemeMode from 'hooks/useThemeMode';
 import keys from 'lodash-es/keys';
 import map from 'lodash-es/map';
-import React, { useState } from 'react';
+import React from 'react';
 import { ITraceTree } from 'types/api/trace/getTraceItem';
 
-import { CustomSubText, CustomSubTitle, styles } from './styles';
+import EllipsedButton from './EllipsedButton';
+import { CustomSubText, CustomSubTitle } from './styles';
 
 const { Panel } = Collapse;
 
-function ErrorTag({ event }: ErrorTagProps): JSX.Element {
-  const [isOpen, setIsOpen] = useState(false);
+function ErrorTag({
+  event,
+  onToggleHandler,
+  setText,
+}: ErrorTagProps): JSX.Element {
   const { isDarkMode } = useThemeMode();
 
-  const [text, setText] = useState({
-    text: '',
-    subText: '',
-  });
-
-  const onToggleHandler = (state: boolean): void => {
-    setIsOpen(state);
-  };
-
   return (
     <>
       {map(event, ({ attributeMap, name }) => {
@@ -45,23 +38,23 @@ function ErrorTag({ event }: ErrorTagProps): JSX.Element {
             return (
               <>
                 <CustomSubTitle>{event}</CustomSubTitle>
-                <CustomSubText isDarkMode={isDarkMode}>
+                <CustomSubText ellipsis isDarkMode={isDarkMode}>
                   {value}
                   {isEllipsed && (
-                    <StyledButton
-                      styledclass={[styles.removeMargin, styles.removePadding]}
-                      onClick={(): void => {
-                        onToggleHandler(true);
-                        setText({
-                          subText: value,
-                          text: event,
-                        });
-                      }}
-                      type="link"
-                    >
-                      View full log event message
-                    </StyledButton>
+                    <EllipsedButton
+                      onToggleHandler={onToggleHandler}
+                      setText={setText}
+                      value={value}
+                      event={event}
+                    />
                   )}
                 </CustomSubText>
@@ -71,31 +64,14 @@ function ErrorTag({ event }: ErrorTagProps): JSX.Element {
               </>
             );
           })}
-
-      <Modal
-        onCancel={(): void => onToggleHandler(false)}
-        title="Log Message"
-        visible={isOpen}
-        destroyOnClose
-        footer={[]}
-        width="70vw"
-      >
-        <CustomSubTitle>{text.text}</CustomSubTitle>
-
-        {text.text === 'exception.stacktrace' ? (
-          <Editor onChange={(): void => {}} readOnly value={text.subText} />
-        ) : (
-          <CustomSubText isDarkMode={isDarkMode}>
-            {text.subText}
-          </CustomSubText>
-        )}
-      </Modal>
     </>
   );
 }
 
 interface ErrorTagProps {
   event: ITraceTree['event'];
+  onToggleHandler: (isOpen: boolean) => void;
+  setText: (text: { subText: string; text: string }) => void;
 }
 
 export default ErrorTag;

diff --git a/frontend/src/container/TraceDetail/SelectedSpanDetails/index.tsx b/frontend/src/container/TraceDetail/SelectedSpanDetails/index.tsx
index 08d6c057a9..49596d14d0 100644
--- a/frontend/src/container/TraceDetail/SelectedSpanDetails/index.tsx
+++ b/frontend/src/container/TraceDetail/SelectedSpanDetails/index.tsx
@@ -1,9 +1,11 @@
-import { Tabs, Tooltip, Typography } from 'antd';
+import { Modal, Tabs, Tooltip, Typography } from 'antd';
+import Editor from 'components/Editor';
 import { StyledSpace } from 'components/Styled';
 import useThemeMode from 'hooks/useThemeMode';
-import React, { useMemo } from 'react';
+import React, { useMemo, useState } from 'react';
 import { ITraceTree } from 'types/api/trace/getTraceItem';
 
+import EllipsedButton from './EllipsedButton';
 import ErrorTag from './ErrorTag';
 import {
 	CardContainer,
@@ -12,6 +14,7 @@ import {
 	CustomText,
 	CustomTitle,
 	styles,
+	SubTextContainer,
 } from './styles';
 
 const { TabPane } = Tabs;
@@ -26,6 +29,17 @@ function SelectedSpanDetails(props: SelectedSpanDetailsProps): JSX.Element {
 		tree?.serviceName,
 	]);
 
+	const [isOpen, setIsOpen] = useState(false);
+
+	const [text, setText] = useState({
+		text: '',
+		subText: '',
+	});
+
+	const onToggleHandler = (state: boolean): void => {
+		setIsOpen(state);
+	};
+
 	if (!tree) {
 		return <div />;
 	}
@@ -52,18 +66,60 @@ function SelectedSpanDetails(props: SelectedSpanDetailsProps): JSX.Element {
 
+				<Modal
+					onCancel={(): void => onToggleHandler(false)}
+					title={text.text}
+					visible={isOpen}
+					destroyOnClose
+					footer={[]}
+					width="70vw"
+					centered
+				>
+					{text.text === 'exception.stacktrace' ? (
+						<Editor onChange={(): void => {}} readOnly value={text.subText} />
+					) : (
+						<CustomSubText isDarkMode={isDarkMode}>
+							{text.subText}
+						</CustomSubText>
+					)}
+				</Modal>
+
 				{tags.length !== 0 ? (
 					tags.map((tags) => {
+						const value = tags.key === 'error' ? 'true' : tags.value;
+						const isEllipsed = value.length > 24;
+
 						return (
 							<StyledSpace key={tags.key} direction="vertical">
 								{tags.value && (
 									<>
 										<CustomSubTitle>{tags.key}</CustomSubTitle>
-										<CustomSubText isDarkMode={isDarkMode}>
-											{tags.key === 'error' ? 'true' : tags.value}
-										</CustomSubText>
+										<SubTextContainer isDarkMode={isDarkMode}>
+											<Tooltip overlay={(): string => value}>
+												<CustomSubText ellipsis isDarkMode={isDarkMode}>
+													{value}
+												</CustomSubText>
+											</Tooltip>
+											{isEllipsed && (
+												<EllipsedButton
+													onToggleHandler={onToggleHandler}
+													setText={setText}
+													value={value}
+													event={tags.key}
+													buttonText="View full value"
+												/>
+											)}
+										</SubTextContainer>
 									</>
 								)}
@@ -75,7 +131,11 @@ function SelectedSpanDetails(props: SelectedSpanDetailsProps): JSX.Element {
 
 				{tree.event && Object.keys(tree.event).length !== 0 ? (
-					<ErrorTag event={tree.event} />
+					<ErrorTag
+						event={tree.event}
+						onToggleHandler={onToggleHandler}
+						setText={setText}
+					/>
 				) : (
 					<Typography>No events data in selected span</Typography>
 				)}

diff --git a/frontend/src/container/TraceDetail/SelectedSpanDetails/styles.ts b/frontend/src/container/TraceDetail/SelectedSpanDetails/styles.ts
index d8bae86ba7..3c9180dc94 100644
--- a/frontend/src/container/TraceDetail/SelectedSpanDetails/styles.ts
+++ b/frontend/src/container/TraceDetail/SelectedSpanDetails/styles.ts
@@ -18,7 +18,8 @@ export const CustomText = styled(Paragraph)`
 export const CustomSubTitle = styled(Title)`
 	&&& {
 		font-size: 14px;
-		margin-bottom: 8px;
+		margin-bottom: 0.1rem;
+		margin-top: 0.5rem;
 	}
 `;
 
@@ -26,13 +27,19 @@ interface CustomSubTextProps {
 	isDarkMode: boolean;
 }
 
+export const SubTextContainer = styled.div<CustomSubTextProps>`
+	&&& {
+		background: ${({ isDarkMode }): string => (isDarkMode ? '#444' : '#ddd')};
+	}
+`;
+
 export const CustomSubText = styled(Paragraph)<CustomSubTextProps>`
 	&&& {
 		background: ${({ isDarkMode }): string => (isDarkMode ? '#444' : '#ddd')};
 		font-size: 12px;
-		padding: 6px 8px;
+		padding: 4px 8px;
 		word-break: break-all;
-		margin-bottom: 16px;
+		margin-bottom: 0rem;
 	}
 `;
 
@@ -81,10 +88,15 @@ const overflow = css`
 	}
 `;
 
+const buttonContainer = css`
+	height: 1.5rem;
+`;
+
 export const styles = {
 	removeMargin,
 	removePadding,
 	selectedSpanDetailsContainer,
 	spanEventsTabsContainer,
 	overflow,
+	buttonContainer,
 };

From e4883495c3af6a9ed800dc9ce09838f6108313ca Mon Sep 17 00:00:00 2001
From: Prashant Shahi
Date: Fri, 15 Jul 2022 16:40:45 +0530
Subject: [PATCH 38/43] =?UTF-8?q?fix(exceptions-page):=20=F0=9F=9A=91=20un?=
 =?UTF-8?q?ix=20nanoseconds=20operations=20(#1403)?=
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

Signed-off-by: Prashant Shahi
---
 frontend/src/container/AllError/utils.ts      | 6 +++---
 frontend/src/container/ErrorDetails/index.tsx | 6 +++---
 frontend/yarn.lock                            | 5 +++++
 3 files changed, 11 insertions(+), 6 deletions(-)

diff --git a/frontend/src/container/AllError/utils.ts b/frontend/src/container/AllError/utils.ts
index 747c75cf58..239d404b1c 100644
--- a/frontend/src/container/AllError/utils.ts
+++ b/frontend/src/container/AllError/utils.ts
@@ -74,10 +74,10 @@ export const getDefaultOrder = (
 	return undefined;
 };
 
-export const getNanoSeconds = (date: string): number => {
+export const getNanoSeconds = (date: string): string => {
 	return (
-		parseInt((new Date(date).getTime() / 1e3).toString(), 10) * 1e9 +
-		Timestamp.fromString(date).getNano()
+		Math.floor(new Date(date).getTime() / 1e3).toString() +
+		Timestamp.fromString(date).getNano().toString()
 	);
 };

diff --git a/frontend/src/container/ErrorDetails/index.tsx b/frontend/src/container/ErrorDetails/index.tsx
index a200744890..d42d2e4a3e 100644
--- a/frontend/src/container/ErrorDetails/index.tsx
+++ b/frontend/src/container/ErrorDetails/index.tsx
@@ -40,7 +40,7 @@ function ErrorDetails(props: ErrorDetailsProps): JSX.Element {
 			getNextPrevId({
 				errorID: errorId || idPayload.errorId,
 				groupID: idPayload.groupID,
-				timestamp: timestamp || getNanoSeconds(idPayload.timestamp).toString(),
+				timestamp: timestamp || getNanoSeconds(idPayload.timestamp),
 			}),
 		},
 	);
@@ -79,7 +79,7 @@ function ErrorDetails(props: ErrorDetailsProps): JSX.Element {
 	const onClickErrorIdHandler = async (
 		id: string,
-		timespamp: string,
+		timestamp: string,
 	): Promise<void> => {
 		try {
 			if (id.length === 0) {
@@ -92,7 +92,7 @@ function ErrorDetails(props: ErrorDetailsProps): JSX.Element {
 			history.replace(
 				`${history.location.pathname}?&groupId=${
 					idPayload.groupID
-				}&timestamp=${getNanoSeconds(timespamp)}&errorId=${id}`,
+				}&timestamp=${getNanoSeconds(timestamp)}&errorId=${id}`,
 			);
 		} catch (error) {
 			notification.error({

diff --git a/frontend/yarn.lock b/frontend/yarn.lock
index 93373097a1..f2d9ad04ad 100644
--- a/frontend/yarn.lock
+++ b/frontend/yarn.lock
@@ -4093,6 +4093,11 @@ chartjs-adapter-date-fns@^2.0.0:
   resolved "https://registry.yarnpkg.com/chartjs-adapter-date-fns/-/chartjs-adapter-date-fns-2.0.0.tgz#5e53b2f660b993698f936f509c86dddf9ed44c6b"
   integrity sha512-rmZINGLe+9IiiEB0kb57vH3UugAtYw33anRiw5kS2Tu87agpetDDoouquycWc9pRsKtQo5j+vLsYHyr8etAvFw==
 
+chartjs-plugin-annotation@^1.4.0:
+  version "1.4.0"
+  resolved "https://registry.yarnpkg.com/chartjs-plugin-annotation/-/chartjs-plugin-annotation-1.4.0.tgz#4c84cec1ec838bc09712f3686237866e6c3f4798"
+  integrity sha512-OC0eGoVvdxTtGGi8mV3Dr+G1YmMhtYYQWqGMb2uWcgcnyiBslaRKPofKwAYWPbh7ABnmQNsNDQLIKPH+XiaZLA==
+
 "chokidar@>=3.0.0 <4.0.0", chokidar@^3.5.3:
   version "3.5.3"
   resolved "https://registry.yarnpkg.com/chokidar/-/chokidar-3.5.3.tgz#1cf37c8707b932bd1af1ae22c0432e2acd1903bd"

From 10c6325e467895f8ea11aa070eeb514fe5cff155 Mon Sep 17 00:00:00 2001
From: Prashant Shahi
Date: Fri, 15 Jul 2022 17:10:27 +0530
Subject: [PATCH 39/43] =?UTF-8?q?chore(clickhouse):=20=F0=9F=94=8A=20updat?=
 =?UTF-8?q?e=20logging=20level=20to=20info=20(#1401)?=
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

Signed-off-by:
Prashant Shahi
---
 .../clickhouse-setup/clickhouse-config.xml  |    2 +-
 .../clickhouse-setup/clickhouse-config.xml  |    2 +-
 deploy/docker/clickhouse-setup/config.xml   | 1304 -----------------
 .../tests/test-deploy/clickhouse-config.xml |    2 +-
 4 files changed, 3 insertions(+), 1307 deletions(-)
 delete mode 100644 deploy/docker/clickhouse-setup/config.xml

diff --git a/deploy/docker-swarm/clickhouse-setup/clickhouse-config.xml b/deploy/docker-swarm/clickhouse-setup/clickhouse-config.xml
index 3bb26a3a36..4a6a82b8af 100644
--- a/deploy/docker-swarm/clickhouse-setup/clickhouse-config.xml
+++ b/deploy/docker-swarm/clickhouse-setup/clickhouse-config.xml
@@ -22,7 +22,7 @@
          [1]: https://github.com/pocoproject/poco/blob/poco-1.9.4-release/Foundation/include/Poco/Logger.h#L105-L114
     -->
-        <level>trace</level>
+        <level>information</level>
         <log>/var/log/clickhouse-server/clickhouse-server.log</log>
         <errorlog>/var/log/clickhouse-server/clickhouse-server.err.log</errorlog>

diff --git a/deploy/docker/clickhouse-setup/clickhouse-config.xml b/deploy/docker/clickhouse-setup/clickhouse-config.xml
index 3bb26a3a36..4a6a82b8af 100644
--- a/deploy/docker/clickhouse-setup/clickhouse-config.xml
+++ b/deploy/docker/clickhouse-setup/clickhouse-config.xml
@@ -22,7 +22,7 @@
          [1]: https://github.com/pocoproject/poco/blob/poco-1.9.4-release/Foundation/include/Poco/Logger.h#L105-L114
     -->
-        <level>trace</level>
+        <level>information</level>
         <log>/var/log/clickhouse-server/clickhouse-server.log</log>
         <errorlog>/var/log/clickhouse-server/clickhouse-server.err.log</errorlog>

diff --git a/deploy/docker/clickhouse-setup/config.xml b/deploy/docker/clickhouse-setup/config.xml
deleted file mode 100644
[hunk omitted: the 1304 deleted lines were the default ClickHouse server config.xml (logger, ports, TLS, macros, system log tables, etc.); its XML markup did not survive extraction]

diff --git a/pkg/query-service/tests/test-deploy/clickhouse-config.xml b/pkg/query-service/tests/test-deploy/clickhouse-config.xml
index 3bb26a3a36..4a6a82b8af 100644
--- a/pkg/query-service/tests/test-deploy/clickhouse-config.xml
+++ b/pkg/query-service/tests/test-deploy/clickhouse-config.xml
@@ -22,7 +22,7 @@
          [1]: https://github.com/pocoproject/poco/blob/poco-1.9.4-release/Foundation/include/Poco/Logger.h#L105-L114
     -->
-        <level>trace</level>
+        <level>information</level>
         <log>/var/log/clickhouse-server/clickhouse-server.log</log>
         <errorlog>/var/log/clickhouse-server/clickhouse-server.err.log</errorlog>