Merge pull request #1480 from SigNoz/release/v0.10.1

Release/v0.10.1
This commit is contained in:
Ankit Nayan 2022-08-07 15:35:29 +05:30 committed by GitHub
commit 8f9d0f2403
No known key found for this signature in database
GPG Key ID: 4AEE18F83AFDEB23
122 changed files with 2431 additions and 897 deletions

.github/CODEOWNERS

@@ -4,4 +4,4 @@
 * @ankitnayan
 /frontend/ @palashgdev @pranshuchittora
 /deploy/ @prashant-shahi
-/pkg/query-service/ @srikanthccv @makeavish @nityanandagohain
+/pkg/query-service/ @srikanthccv


@ -1,122 +1,331 @@
# Contributing Guidelines

## Welcome to SigNoz Contributing section 🎉

Hi there! We're thrilled that you'd like to contribute to this project, thank you for your interest. Whether it's a bug report, new feature, correction, or additional documentation, we greatly value feedback and contributions from our community.

Please read through this document before submitting any issues or pull requests to ensure we have all the necessary information to effectively respond to your bug report or contribution.

- We accept contributions made to the [SigNoz `develop` branch]()
- Find all SigNoz Docker Hub images here
  - [signoz/frontend](https://hub.docker.com/r/signoz/frontend)
  - [signoz/query-service](https://hub.docker.com/r/signoz/query-service)
  - [signoz/otelcontribcol](https://hub.docker.com/r/signoz/otelcontribcol)

## Finding contributions to work on 💬

Looking at the existing issues is a great way to find something to contribute on.
Also, have a look at these [good first issues label](https://github.com/SigNoz/signoz/issues?q=is%3Aissue+is%3Aopen+label%3A%22good+first+issue%22) to start with.

## Sections:
- [General Instructions](#1-general-instructions-)
- [For Creating Issue(s)](#11-for-creating-issues)
- [For Pull Requests(s)](#12-for-pull-requests)
- [How to Contribute](#2-how-to-contribute-%EF%B8%8F)
- [Develop Frontend](#3-develop-frontend-)
- [Contribute to Frontend with Docker installation of SigNoz](#31-contribute-to-frontend-with-docker-installation-of-signoz)
- [Contribute to Frontend without installing SigNoz backend](#32-contribute-to-frontend-without-installing-signoz-backend)
- [Contribute to Backend (Query-Service)](#4-contribute-to-backend-query-service-)
- [To run ClickHouse setup](#41-to-run-clickhouse-setup-recommended-for-local-development)
- [Contribute to SigNoz Helm Chart](#5-contribute-to-signoz-helm-chart-)
- [To run helm chart for local development](#51-to-run-helm-chart-for-local-development)
- [Other Ways to Contribute](#other-ways-to-contribute)
# 1. General Instructions 📝

## 1.1 For Creating Issue(s)
Before making any significant changes and before filing a new issue, please check [existing open](https://github.com/SigNoz/signoz/issues?q=is%3Aopen+is%3Aissue), or [recently closed](https://github.com/SigNoz/signoz/issues?q=is%3Aissue+is%3Aclosed) issues to make sure somebody else hasn't already reported the issue. Please try to include as much information as you can.
**Issue Types** - [Bug Report](https://github.com/SigNoz/signoz/issues/new?assignees=&labels=&template=bug_report.md&title=) | [Feature Request](https://github.com/SigNoz/signoz/issues/new?assignees=&labels=&template=feature_request.md&title=) | [Performance Issue Report](https://github.com/SigNoz/signoz/issues/new?assignees=&labels=&template=performance-issue-report.md&title=) | [Report a Security Vulnerability](https://github.com/SigNoz/signoz/security/policy)
#### Details like these are incredibly useful:
- **Requirement** - what kind of use case are you trying to solve?
- **Proposal** - what do you suggest to solve the problem or improve the existing
situation?
- Any open questions to address❓
#### If you are reporting a bug, details like these are incredibly useful:
- A reproducible test case or series of steps.
- The version of our code being used.
- Any modifications you've made relevant to the bug🐞.
- Anything unusual about your environment or deployment.
Discussing your proposed changes ahead of time will make the contribution
process smooth for everyone 🙌.
**[`^top^`](#)**
<hr>
## 1.2 For Pull Request(s)
Contributions via pull requests are much appreciated. Once the approach is agreed upon ✅, make your changes and open a pull request.
Before sending us a pull request, please:
- Fork the SigNoz repo on GitHub, clone it on your machine.
- Create a branch with your changes.
- You are working against the latest source on the `develop` branch.
- Modify the source; please focus only on the specific change you are contributing.
- Ensure local tests pass.
- Commit to your fork using clear commit messages.
- Send us a pull request, answering any default questions in the pull request interface.
- Pay attention to any automated CI failures reported in the pull request, and stay involved in the conversation
- Once you've pushed your commits to GitHub, make sure that your branch can be auto-merged (there are no merge conflicts). If not, on your computer, merge main into your branch, resolve any merge conflicts, make sure everything still runs correctly and passes all the tests, and then push up those changes.
- Once the change has been approved and merged, we will inform you in a comment.
GitHub provides additional document on [forking a repository](https://help.github.com/articles/fork-a-repo/) and
[creating a pull request](https://help.github.com/articles/creating-a-pull-request/).
**Note:** Unless your change is small, **please** consider submitting different Pull Request(s):
* 1⃣ First PR should include the overall structure of the new component:
* Readme, configuration, interfaces or base classes, etc...
* This PR is usually trivial to review, so the size limit does not apply to
it.
* 2⃣ Second PR should include the concrete implementation of the component. If the
size of this PR is larger than the recommended size, consider **splitting** ⚔️ it into
multiple PRs.
  * If there are multiple sub-components, then ideally each one should be implemented as
a **separate** pull request.
* Last PR should include changes to **any user-facing documentation.** And should include
end-to-end tests if applicable. The component must be enabled
only after sufficient testing, and there is enough confidence in the
stability and quality of the component.
You can always reach out to `ankit@signoz.io` to understand more about the repo and product. We are very responsive over email and [Slack](https://signoz.io/slack).
### Pointers:
- If you find any **bugs** → please create an [**issue.**](https://github.com/SigNoz/signoz/issues/new?assignees=&labels=&template=bug_report.md&title=)
- If you find anything **missing** in documentation → you can create an issue with the label **`documentation`**.
- If you want to build any **new feature** → please create an [issue with the label **`enhancement`**.](https://github.com/SigNoz/signoz/issues/new?assignees=&labels=&template=feature_request.md&title=)
- If you want to **discuss** something about the product, start a new [**discussion**.](https://github.com/SigNoz/signoz/discussions)
<hr>
### Conventions to follow when submitting Commits and Pull Request(s).
We try to follow [Conventional Commits](https://www.conventionalcommits.org/en/v1.0.0/), more specifically the commits and PRs **should have type specifiers** prefixed in the name. [This](https://www.conventionalcommits.org/en/v1.0.0/#specification) should give you a better idea.
e.g. If you are submitting a fix for an issue in frontend, the PR name should be prefixed with **`fix(FE):`**
- Follow [GitHub Flow](https://guides.github.com/introduction/flow/) guidelines for your contribution flows.
- Feel free to ping us on [`#contributing`](https://signoz-community.slack.com/archives/C01LWQ8KS7M) or [`#contributing-frontend`](https://signoz-community.slack.com/archives/C027134DM8B) on our slack community if you need any help on this :)
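As a quick illustration of the convention, a message can be sanity-checked against a `type(scope): description` pattern. This is only a sketch: the regex and the list of types here are assumptions for demonstration, not the project's official tooling.

```shell
# Check that a commit message carries a conventional-commit prefix such as
# "fix(FE):" or "feat(QS):". Type list and regex are illustrative.
msg="fix(FE): align service map tooltip"
if printf '%s' "$msg" | grep -Eq '^(feat|fix|chore|docs|refactor|perf|test)(\([A-Za-z-]+\))?: .+'; then
  echo "looks conventional"
else
  echo "missing type prefix"
fi
```

The same check could be wired into a local `commit-msg` hook if you find it useful.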
**[`^top^`](#)**
<hr>
# 2. How to Contribute 🙋🏻‍♂️
#### There are primarily 2 areas in which you can contribute to SigNoz
- [**Frontend**](#3-develop-frontend-) (Written in Typescript, React)
- [**Backend**](#4-contribute-to-backend-query-service-) (Query Service, written in Go)
Depending upon your area of expertise & interest, you can choose one or more to contribute. Below are detailed instructions to contribute in each area.
**Please note:** If you want to work on an issue, please ask the maintainers to assign the issue to you before starting work on it. This would help us understand who is working on an issue and prevent duplicate work. 🙏🏻
⚠️ If you just raise a PR, without the corresponding issue being assigned to you - it may not be accepted.
**[`^top^`](#)**
<hr>
# 3. Develop Frontend 🌚
**Need to Update: [https://github.com/SigNoz/signoz/tree/develop/frontend](https://github.com/SigNoz/signoz/tree/develop/frontend)**
Also, have a look at [Frontend README.md](https://github.com/SigNoz/signoz/blob/develop/frontend/README.md) sections for more info on how to setup SigNoz frontend locally (with and without Docker).
## 3.1 Contribute to Frontend with Docker installation of SigNoz
- Clone the SigNoz repository and cd into signoz directory,
```
git clone https://github.com/SigNoz/signoz.git && cd signoz
```
- Comment out `frontend` service section at [`deploy/docker/clickhouse-setup/docker-compose.yaml#L68`](https://github.com/SigNoz/signoz/blob/develop/deploy/docker/clickhouse-setup/docker-compose.yaml#L68)
![develop-frontend](https://user-images.githubusercontent.com/52788043/179009217-6692616b-17dc-4d27-b587-9d007098d739.jpeg)
- run `cd deploy` to move to deploy directory,
- Install signoz locally **without** the frontend,
- Add / Uncomment the below configuration to query-service section at [`deploy/docker/clickhouse-setup/docker-compose.yaml#L47`](https://github.com/SigNoz/signoz/blob/develop/deploy/docker/clickhouse-setup/docker-compose.yaml#L47)
```
ports:
  - "8080:8080"
```
  <img width="869" alt="query service" src="https://user-images.githubusercontent.com/52788043/179010251-8489be31-04ca-42f8-b30d-ef0bb6accb6b.png">

- Next run,
```
sudo docker-compose -f docker/clickhouse-setup/docker-compose.yaml up -d
```
- `cd ../frontend` and change baseURL in file [`frontend/src/constants/env.ts#L2`](https://github.com/SigNoz/signoz/blob/develop/frontend/src/constants/env.ts#L2); for that, you need to create a `.env` file in the `frontend` directory with the following environment variable (`FRONTEND_API_ENDPOINT`) matching your configuration.

  If you have the backend API exposed via frontend nginx:
```
FRONTEND_API_ENDPOINT=http://localhost:3301
```
If not:
```
FRONTEND_API_ENDPOINT=http://localhost:8080
```
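As a sketch, the `.env` file for the second case can be created from the shell; the paths assume you are at the repository root, and the endpoint value should match your own setup.

```shell
# Create frontend/.env pointing the dev server straight at query-service
# on :8080 (use :3301 instead if you go through the frontend nginx).
mkdir -p frontend
cat > frontend/.env <<'EOF'
FRONTEND_API_ENDPOINT=http://localhost:8080
EOF
```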
- Next,
```
yarn install
yarn dev
```
### Important Notes:
The Maintainers / Contributors who will change Line Numbers of `Frontend` & `Query-Section`, please update line numbers in [`/.scripts/commentLinesForSetup.sh`](https://github.com/SigNoz/signoz/blob/develop/.scripts/commentLinesForSetup.sh)

**[`^top^`](#)**

<hr>

## 3.2 Contribute to Frontend without installing SigNoz backend

If you don't want to install the SigNoz backend just for doing frontend development, we can provide you with test environments that you can use as the backend.

- Clone the SigNoz repository and cd into signoz/frontend directory,
```
git clone https://github.com/SigNoz/signoz.git && cd signoz/frontend
```
- Create a file `.env` in the `frontend` directory with `FRONTEND_API_ENDPOINT=<test environment URL>`
- Next,
```
yarn install
yarn dev
```
Please ping us in the [`#contributing`](https://signoz-community.slack.com/archives/C01LWQ8KS7M) channel or ask `@Prashant Shahi` in our [Slack Community](https://signoz.io/slack) and we will DM you with `<test environment URL>`.

**Frontend should now be accessible at** [`http://localhost:3301/application`](http://localhost:3301/application)

**[`^top^`](#)**

<hr>

# 4. Contribute to Backend (Query-Service) 🌑

[**https://github.com/SigNoz/signoz/tree/develop/pkg/query-service**](https://github.com/SigNoz/signoz/tree/develop/pkg/query-service)

## 4.1 To run ClickHouse setup (recommended for local development)
- Clone the SigNoz repository and cd into signoz directory,
```
git clone https://github.com/SigNoz/signoz.git && cd signoz
```
- run `sudo make dev-setup` to configure local setup to run query-service,
- Comment out `frontend` service section at [`deploy/docker/clickhouse-setup/docker-compose.yaml#L68`](https://github.com/SigNoz/signoz/blob/develop/deploy/docker/clickhouse-setup/docker-compose.yaml#L68)
<img width="982" alt="develop-frontend" src="https://user-images.githubusercontent.com/52788043/179043977-012be8b0-a2ed-40d1-b2e6-2ab72d7989c0.png">
- Comment out `query-service` section at [`deploy/docker/clickhouse-setup/docker-compose.yaml#L41`,](https://github.com/SigNoz/signoz/blob/develop/deploy/docker/clickhouse-setup/docker-compose.yaml#L41)
<img width="1068" alt="Screenshot 2022-07-14 at 22 48 07" src="https://user-images.githubusercontent.com/52788043/179044151-a65ba571-db0b-4a16-b64b-ca3fadcf3af0.png">
- add below configuration to `clickhouse` section at [`deploy/docker/clickhouse-setup/docker-compose.yaml`,](https://github.com/SigNoz/signoz/blob/develop/deploy/docker/clickhouse-setup/docker-compose.yaml)
```
ports:
- 9001:9000
```
<img width="1013" alt="Screenshot 2022-07-14 at 22 50 37" src="https://user-images.githubusercontent.com/52788043/179044544-a293d3bc-4c4f-49ea-a276-505a381de67d.png">
- run `cd pkg/query-service/` to move to `query-service` directory,
- Then, you need to create a `.env` file with the following environment variable
```
SIGNOZ_LOCAL_DB_PATH="./signoz.db"
```
to set your local environment with the right `RELATIONAL_DATASOURCE_PATH` as mentioned in [`./constants/constants.go#L38`,](https://github.com/SigNoz/signoz/blob/develop/pkg/query-service/constants/constants.go#L38)
- Now, install SigNoz locally **without** the `frontend` and `query-service`,
- If you are using `x86_64` processors (All Intel/AMD processors) run `sudo make run-x86`
- If you are on `arm64` processors (Apple M1 Macs) run `sudo make run-arm`
#### Run locally,
```
ClickHouseUrl=tcp://localhost:9001 STORAGE=clickhouse go run main.go
```

#### Build and Run locally
```
cd pkg/query-service
go build -o build/query-service main.go
ClickHouseUrl=tcp://localhost:9001 STORAGE=clickhouse build/query-service
```
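The `ClickHouseUrl=... STORAGE=clickhouse` prefix in the commands above relies on standard POSIX shell behavior: assignments placed before a command are exported only to that command, not to your shell session. A minimal stand-in (`FOO` is purely illustrative):

```shell
# The prefixed assignment is visible inside the child command only;
# the surrounding shell never sees FOO.
FOO=clickhouse sh -c 'echo "inside: $FOO"'
echo "after: ${FOO:-unset}"
```

This is why the run command needs no `export` and leaves no variables behind.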
#### Docker Images
The docker images of query-service are available at https://hub.docker.com/r/signoz/query-service

```
docker pull signoz/query-service
```
```
docker pull signoz/query-service:latest
```
```
docker pull signoz/query-service:develop
```
### Important Note:
The Maintainers / Contributors who will change Line Numbers of `Frontend` & `Query-Section`, please update line numbers in [`/.scripts/commentLinesForSetup.sh`](https://github.com/SigNoz/signoz/blob/develop/.scripts/commentLinesForSetup.sh)
**Query Service should now be available at** [`http://localhost:8080`](http://localhost:8080)
If you want to see how the frontend plays with query service, you can run the frontend also in your local env with the baseURL changed to `http://localhost:8080` in file [`frontend/src/constants/env.ts`](https://github.com/SigNoz/signoz/blob/develop/frontend/src/constants/env.ts) as the `query-service` is now running at port `8080`.
---
<!-- Instead of configuring a local setup, you can also use [Gitpod](https://www.gitpod.io/), a VSCode-based Web IDE.

Click the button below. A workspace with all required environments will be created.

[![Open in Gitpod](https://gitpod.io/button/open-in-gitpod.svg)](https://gitpod.io/#https://github.com/SigNoz/signoz)

> To use it on your forked repo, edit the 'Open in Gitpod' button URL to `https://gitpod.io/#https://github.com/<your-github-username>/signoz` -->
**[`^top^`](#)**

<hr>

# 5. Contribute to SigNoz Helm Chart 📊

**Need to Update: [https://github.com/SigNoz/charts](https://github.com/SigNoz/charts).**

## 5.1 To run helm chart for local development
- Clone the SigNoz repository and cd into charts directory,
```
git clone https://github.com/SigNoz/charts.git && cd charts
```
- It is recommended to use lightweight kubernetes (k8s) cluster for local development:
  - [kind](https://kind.sigs.k8s.io/docs/user/quick-start/#installation)
  - [k3d](https://k3d.io/#installation)
  - [minikube](https://minikube.sigs.k8s.io/docs/start/)
- create a k8s cluster and make sure `kubectl` points to the locally created k8s cluster,
- run `make dev-install` to install SigNoz chart with `my-release` release name in `platform` namespace,
- next run,
```
kubectl -n platform port-forward svc/my-release-signoz-frontend 3301:3301
```
to make SigNoz UI available at [localhost:3301](http://localhost:3301)
**5.1.1 To install the HotROD sample app:**

```bash
curl -sL https://github.com/SigNoz/signoz/raw/main/sample-apps/hotrod/hotrod-install.sh \
 | HELM_RELEASE=my-release SIGNOZ_NAMESPACE=platform bash
```
**5.1.2 To load data with the HotROD sample app:**

```bash
kubectl -n sample-application run strzal --image=djbingham/curl \
@ -124,7 +333,7 @@ kubectl -n sample-application run strzal --image=djbingham/curl \
 'locust_count=6' -F 'hatch_rate=2' http://locust-master:8089/swarm
```
**5.1.3 To stop the load generation:**

```bash
kubectl -n sample-application run strzal --image=djbingham/curl \
@ -132,59 +341,32 @@ kubectl -n sample-application run strzal --image=djbingham/curl \
 http://locust-master:8089/stop
```
**5.1.4 To delete the HotROD sample app:**

```bash
curl -sL https://github.com/SigNoz/signoz/raw/main/sample-apps/hotrod/hotrod-delete.sh \
 | HOTROD_NAMESPACE=sample-application bash
```
**[`^top^`](#)**
---
## Other Ways to Contribute

There are many other ways to get involved with the community and to participate in this project:

- Use the product, submitting GitHub issues when a problem is found.
- Help code review pull requests and participate in issue threads.
- Submit a new feature request as an issue.
- Help answer questions on forums such as Stack Overflow and [SigNoz Community Slack Channel](https://signoz.io/slack).
- Tell others about the project on Twitter, your blog, etc.

## License

By contributing to SigNoz, you agree that your contributions will be licensed under its MIT license.

Again, feel free to ping us on [`#contributing`](https://signoz-community.slack.com/archives/C01LWQ8KS7M) or [`#contributing-frontend`](https://signoz-community.slack.com/archives/C027134DM8B) on our slack community if you need any help on this :)

Thank You!


@@ -20,6 +20,7 @@
 </default>
 <s3>
 <disk>s3</disk>
+<perform_ttl_move_on_insert>0</perform_ttl_move_on_insert>
 </s3>
 </volumes>
 </tiered>


@@ -27,7 +27,7 @@ services:
 retries: 3
 alertmanager:
-image: signoz/alertmanager:0.23.0-0.1
+image: signoz/alertmanager:0.23.0-0.2
 volumes:
 - ./data/alertmanager:/data
 command:
@@ -40,7 +40,7 @@ services:
 condition: on-failure
 query-service:
-image: signoz/query-service:0.10.0
+image: signoz/query-service:0.10.1
 command: ["-config=/root/config/prometheus.yml"]
 # ports:
 # - "6060:6060" # pprof port
@@ -68,7 +68,7 @@ services:
 - clickhouse
 frontend:
-image: signoz/frontend:0.10.0
+image: signoz/frontend:0.10.1
 deploy:
 restart_policy:
 condition: on-failure
@@ -81,7 +81,7 @@ services:
 - ../common/nginx-config.conf:/etc/nginx/conf.d/default.conf
 otel-collector:
-image: signoz/otelcontribcol:0.45.1-1.1
+image: signoz/otelcontribcol:0.45.1-1.3
 command: ["--config=/etc/otel-collector-config.yaml"]
 volumes:
 - ./otel-collector-config.yaml:/etc/otel-collector-config.yaml
@@ -111,7 +111,7 @@ services:
 - clickhouse
 otel-collector-metrics:
-image: signoz/otelcontribcol:0.45.1-1.1
+image: signoz/otelcontribcol:0.45.1-1.3
 command: ["--config=/etc/otel-collector-metrics-config.yaml"]
 volumes:
 - ./otel-collector-metrics-config.yaml:/etc/otel-collector-metrics-config.yaml


@@ -5,9 +5,11 @@ receivers:
 # otel-collector internal metrics
 - job_name: "otel-collector"
 scrape_interval: 60s
-static_configs:
-- targets:
-- otel-collector:8888
+dns_sd_configs:
+- names:
+- 'tasks.otel-collector'
+type: 'A'
+port: 8888
 # otel-collector-metrics internal metrics
 - job_name: "otel-collector-metrics"
 scrape_interval: 60s
@@ -17,9 +19,11 @@ receivers:
 # SigNoz span metrics
 - job_name: "signozspanmetrics-collector"
 scrape_interval: 60s
-static_configs:
-- targets:
-- otel-collector:8889
+dns_sd_configs:
+- names:
+- 'tasks.otel-collector'
+type: 'A'
+port: 8889
 processors:
 batch:


@@ -20,6 +20,7 @@
 </default>
 <s3>
 <disk>s3</disk>
+<perform_ttl_move_on_insert>0</perform_ttl_move_on_insert>
 </s3>
 </volumes>
 </tiered>


@@ -25,7 +25,7 @@ services:
 retries: 3
 alertmanager:
-image: signoz/alertmanager:0.23.0-0.1
+image: signoz/alertmanager:0.23.0-0.2
 volumes:
 - ./data/alertmanager:/data
 depends_on:
@@ -39,7 +39,7 @@ services:
 # Notes for Maintainers/Contributors who will change Line Numbers of Frontend & Query-Section. Please Update Line Numbers in `./scripts/commentLinesForSetup.sh` & `./CONTRIBUTING.md`
 query-service:
-image: signoz/query-service:0.10.0
+image: signoz/query-service:0.10.1
 container_name: query-service
 command: ["-config=/root/config/prometheus.yml"]
 # ports:
@@ -66,7 +66,7 @@ services:
 condition: service_healthy
 frontend:
-image: signoz/frontend:0.10.0
+image: signoz/frontend:0.10.1
 container_name: frontend
 restart: on-failure
 depends_on:
@@ -78,7 +78,7 @@ services:
 - ../common/nginx-config.conf:/etc/nginx/conf.d/default.conf
 otel-collector:
-image: signoz/otelcontribcol:0.45.1-1.1
+image: signoz/otelcontribcol:0.45.1-1.3
 command: ["--config=/etc/otel-collector-config.yaml"]
 volumes:
 - ./otel-collector-config.yaml:/etc/otel-collector-config.yaml
@@ -103,7 +103,7 @@ services:
 condition: service_healthy
 otel-collector-metrics:
-image: signoz/otelcontribcol:0.45.1-1.1
+image: signoz/otelcontribcol:0.45.1-1.3
 command: ["--config=/etc/otel-collector-metrics-config.yaml"]
 volumes:
 - ./otel-collector-metrics-config.yaml:/etc/otel-collector-metrics-config.yaml

View File

@@ -204,9 +204,14 @@ start_docker() {
         echo "Starting docker service"
         $sudo_cmd systemctl start docker.service
     fi
+    # if [[ -z $sudo_cmd ]]; then
+    #     docker ps > /dev/null && true
+    #     if [[ $? -ne 0 ]]; then
+    #         request_sudo
+    #     fi
+    # fi
     if [[ -z $sudo_cmd ]]; then
-        docker ps > /dev/null && true
-        if [[ $? -ne 0 ]]; then
+        if ! docker ps > /dev/null && true; then
             request_sudo
         fi
     fi
@@ -268,8 +273,12 @@ request_sudo() {
     if (( $EUID != 0 )); then
         sudo_cmd="sudo"
         echo -e "Please enter your sudo password, if prompt."
-        $sudo_cmd -l | grep -e "NOPASSWD: ALL" > /dev/null
-        if [[ $? -ne 0 ]] && ! $sudo_cmd -v; then
+        # $sudo_cmd -l | grep -e "NOPASSWD: ALL" > /dev/null
+        # if [[ $? -ne 0 ]] && ! $sudo_cmd -v; then
+        #     echo "Need sudo privileges to proceed with the installation."
+        #     exit 1;
+        # fi
+        if ! $sudo_cmd -l | grep -e "NOPASSWD: ALL" > /dev/null && ! $sudo_cmd -v; then
             echo "Need sudo privileges to proceed with the installation."
             exit 1;
         fi
@@ -303,8 +312,13 @@ echo -e "🌏 Detecting your OS ...\n"
 check_os

 # Obtain unique installation id
-sysinfo="$(uname -a)"
-if [[ $? -ne 0 ]]; then
+# sysinfo="$(uname -a)"
+# if [[ $? -ne 0 ]]; then
+#     uuid="$(uuidgen)"
+#     uuid="${uuid:-$(cat /proc/sys/kernel/random/uuid)}"
+#     sysinfo="${uuid:-$(cat /proc/sys/kernel/random/uuid)}"
+# fi
+if ! sysinfo="$(uname -a)"; then
     uuid="$(uuidgen)"
     uuid="${uuid:-$(cat /proc/sys/kernel/random/uuid)}"
     sysinfo="${uuid:-$(cat /proc/sys/kernel/random/uuid)}"

View File

@@ -16,6 +16,7 @@
     "playwright": "NODE_ENV=testing playwright test --config=./playwright.config.ts",
     "playwright:local:debug": "PWDEBUG=console yarn playwright --headed --browser=chromium",
     "playwright:codegen:local": "playwright codegen http://localhost:3301",
+    "playwright:codegen:local:auth": "yarn playwright:codegen:local --load-storage=tests/auth.json",
     "husky:configure": "cd .. && husky install frontend/.husky && cd frontend && chmod ug+x .husky/*",
     "commitlint": "commitlint --edit $1"
   },

View File

@@ -14,8 +14,8 @@ const config: PlaywrightTestConfig = {
     baseURL: process.env.PLAYWRIGHT_TEST_BASE_URL || 'http://localhost:3301',
   },
   updateSnapshots: 'all',
-  fullyParallel: false,
-  quiet: true,
+  fullyParallel: !!process.env.CI,
+  quiet: false,
   testMatch: ['**/*.spec.ts'],
   reporter: process.env.CI ? 'github' : 'list',
 };

View File

@@ -1,4 +1,11 @@
 {
+    "target_missing": "Please enter a threshold to proceed",
+    "rule_test_fired": "Test notification sent successfully",
+    "no_alerts_found": "No alerts found during the evaluation. This happens when rule condition is unsatisfied. You may adjust the rule threshold and retry.",
+    "button_testrule": "Test Notification",
+    "label_channel_select": "Notification Channels",
+    "placeholder_channel_select": "select one or more channels",
+    "channel_select_tooltip": "Leave empty to send this alert on all the configured channels",
     "preview_chart_unexpected_error": "An unexpeced error occurred updating the chart, please check your query.",
     "preview_chart_threshold_label": "Threshold",
     "placeholder_label_key_pair": "Click here to enter a label (key value pairs)",

View File

@@ -1,4 +1,14 @@
 {
+    "channel_delete_unexp_error": "Something went wrong",
+    "channel_delete_success": "Channel Deleted Successfully",
+    "column_channel_name": "Name",
+    "column_channel_type": "Type",
+    "column_channel_action": "Action",
+    "column_channel_edit": "Edit",
+    "button_new_channel": "New Alert Channel",
+    "tooltip_notification_channels": "More details on how to setting notification channels",
+    "sending_channels_note": "The alerts will be sent to all the configured channels.",
+    "loading_channels_message": "Loading Channels..",
     "page_title_create": "New Notification Channels",
     "page_title_edit": "Edit Notification Channels",
     "button_save_channel": "Save",

View File

@@ -1,4 +1,11 @@
 {
+    "target_missing": "Please enter a threshold to proceed",
+    "rule_test_fired": "Test notification sent successfully",
+    "no_alerts_found": "No alerts found during the evaluation. This happens when rule condition is unsatisfied. You may adjust the rule threshold and retry.",
+    "button_testrule": "Test Notification",
+    "label_channel_select": "Notification Channels",
+    "placeholder_channel_select": "select one or more channels",
+    "channel_select_tooltip": "Leave empty to send this alert on all the configured channels",
     "preview_chart_unexpected_error": "An unexpeced error occurred updating the chart, please check your query.",
     "preview_chart_threshold_label": "Threshold",
     "placeholder_label_key_pair": "Click here to enter a label (key value pairs)",

View File

@@ -1,4 +1,14 @@
 {
+    "channel_delete_unexp_error": "Something went wrong",
+    "channel_delete_success": "Channel Deleted Successfully",
+    "column_channel_name": "Name",
+    "column_channel_type": "Type",
+    "column_channel_action": "Action",
+    "column_channel_edit": "Edit",
+    "button_new_channel": "New Alert Channel",
+    "tooltip_notification_channels": "More details on how to setting notification channels",
+    "sending_channels_note": "The alerts will be sent to all the configured channels.",
+    "loading_channels_message": "Loading Channels..",
     "page_title_create": "New Notification Channels",
     "page_title_edit": "Edit Notification Channels",
     "button_save_channel": "Save",

View File

@@ -0,0 +1,26 @@
import axios from 'api';
import { ErrorResponseHandler } from 'api/ErrorResponseHandler';
import { AxiosError } from 'axios';
import { ErrorResponse, SuccessResponse } from 'types/api';
import { PayloadProps, Props } from 'types/api/alerts/patch';
const patch = async (
props: Props,
): Promise<SuccessResponse<PayloadProps> | ErrorResponse> => {
try {
const response = await axios.patch(`/rules/${props.id}`, {
...props.data,
});
return {
statusCode: 200,
error: null,
message: response.data.status,
payload: response.data.data,
};
} catch (error) {
return ErrorResponseHandler(error as AxiosError);
}
};
export default patch;
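Each of these new API helpers follows the frontend's shared wrapper convention: resolve to a discriminated success/error object instead of making every caller write try/catch. A standalone sketch of that convention (types simplified, no axios; `wrapRequest` is a hypothetical name, not part of the codebase):

```typescript
// Simplified stand-ins for the frontend's response types.
interface SuccessResponse<T> {
	statusCode: 200;
	error: null;
	payload: T;
}
interface ErrorResponse {
	statusCode: number;
	error: string;
	payload: null;
}

// Wrap any async call so callers branch on statusCode instead of catching.
async function wrapRequest<T>(
	fn: () => Promise<T>,
): Promise<SuccessResponse<T> | ErrorResponse> {
	try {
		const payload = await fn();
		return { statusCode: 200, error: null, payload };
	} catch (error) {
		return {
			statusCode: 500,
			error: error instanceof Error ? error.message : 'unknown error',
			payload: null,
		};
	}
}
```

Callers then check `response.statusCode === 200`, exactly the branch `FormAlertRules` takes on the result of `testAlertApi` below.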

View File

@@ -0,0 +1,26 @@
import axios from 'api';
import { ErrorResponseHandler } from 'api/ErrorResponseHandler';
import { AxiosError } from 'axios';
import { ErrorResponse, SuccessResponse } from 'types/api';
import { PayloadProps, Props } from 'types/api/alerts/testAlert';
const testAlert = async (
props: Props,
): Promise<SuccessResponse<PayloadProps> | ErrorResponse> => {
try {
const response = await axios.post('/testRule', {
...props.data,
});
return {
statusCode: 200,
error: null,
message: response.data.status,
payload: response.data.data,
};
} catch (error) {
return ErrorResponseHandler(error as AxiosError);
}
};
export default testAlert;

View File

@@ -0,0 +1,24 @@
import axios from 'api';
import { ErrorResponseHandler } from 'api/ErrorResponseHandler';
import { AxiosError } from 'axios';
import { ErrorResponse, SuccessResponse } from 'types/api';
import { PayloadProps, Props } from 'types/api/metrics/getTopLevelOperations';
const getTopLevelOperations = async (
props: Props,
): Promise<SuccessResponse<PayloadProps> | ErrorResponse> => {
try {
const response = await axios.post(`/service/top_level_operations`);
return {
statusCode: 200,
error: null,
message: response.data.status,
payload: response.data[props.service],
};
} catch (error) {
return ErrorResponseHandler(error as AxiosError);
}
};
export default getTopLevelOperations;

View File

@@ -2,13 +2,13 @@ import axios from 'api';
 import { ErrorResponseHandler } from 'api/ErrorResponseHandler';
 import { AxiosError } from 'axios';
 import { ErrorResponse, SuccessResponse } from 'types/api';
-import { PayloadProps, Props } from 'types/api/metrics/getTopEndPoints';
+import { PayloadProps, Props } from 'types/api/metrics/getTopOperations';

-const getTopEndPoints = async (
+const getTopOperations = async (
   props: Props,
 ): Promise<SuccessResponse<PayloadProps> | ErrorResponse> => {
   try {
-    const response = await axios.post(`/service/top_endpoints`, {
+    const response = await axios.post(`/service/top_operations`, {
       start: `${props.start}`,
       end: `${props.end}`,
       service: props.service,
@@ -26,4 +26,4 @@ const getTopEndPoints = async (
   }
 };

-export default getTopEndPoints;
+export default getTopOperations;

View File

@@ -5,6 +5,7 @@ import ROUTES from 'constants/routes';
 import useComponentPermission from 'hooks/useComponentPermission';
 import history from 'lib/history';
 import React, { useCallback, useState } from 'react';
+import { useTranslation } from 'react-i18next';
 import { useSelector } from 'react-redux';
 import { generatePath } from 'react-router-dom';
 import { AppState } from 'store/reducers';
@@ -14,6 +15,7 @@ import AppReducer from 'types/reducer/app';
 import Delete from './Delete';

 function AlertChannels({ allChannels }: AlertChannelsProps): JSX.Element {
+	const { t } = useTranslation(['channels']);
 	const [notifications, Element] = notification.useNotification();
 	const [channels, setChannels] = useState<Channels[]>(allChannels);
 	const { role } = useSelector<AppState, AppReducer>((state) => state.app);
@@ -29,12 +31,12 @@ function AlertChannels({ allChannels }: AlertChannelsProps): JSX.Element {
 	const columns: ColumnsType<Channels> = [
 		{
-			title: 'Name',
+			title: t('column_channel_name'),
 			dataIndex: 'name',
 			key: 'name',
 		},
 		{
-			title: 'Type',
+			title: t('column_channel_type'),
 			dataIndex: 'type',
 			key: 'type',
 		},
@@ -42,14 +44,14 @@ function AlertChannels({ allChannels }: AlertChannelsProps): JSX.Element {
 	if (action) {
 		columns.push({
-			title: 'Action',
+			title: t('column_channel_action'),
 			dataIndex: 'id',
 			key: 'action',
 			align: 'center',
 			render: (id: string): JSX.Element => (
 				<>
 					<Button onClick={(): void => onClickEditHandler(id)} type="link">
-						Edit
+						{t('column_channel_edit')}
 					</Button>
 					<Delete id={id} setChannels={setChannels} notifications={notifications} />
 				</>

View File

@@ -1,29 +1,31 @@
 import { Button } from 'antd';
 import { NotificationInstance } from 'antd/lib/notification';
-import deleteAlert from 'api/channels/delete';
+import deleteChannel from 'api/channels/delete';
 import React, { useState } from 'react';
+import { useTranslation } from 'react-i18next';
 import { Channels } from 'types/api/channels/getAll';

 function Delete({ notifications, setChannels, id }: DeleteProps): JSX.Element {
+	const { t } = useTranslation(['channels']);
 	const [loading, setLoading] = useState(false);

 	const onClickHandler = async (): Promise<void> => {
 		try {
 			setLoading(true);
-			const response = await deleteAlert({
+			const response = await deleteChannel({
 				id,
 			});
 			if (response.statusCode === 200) {
 				notifications.success({
 					message: 'Success',
-					description: 'Channel Deleted Successfully',
+					description: t('channel_delete_success'),
 				});
 				setChannels((preChannels) => preChannels.filter((e) => e.id !== id));
 			} else {
 				notifications.error({
 					message: 'Error',
-					description: response.error || 'Something went wrong',
+					description: response.error || t('channel_delete_unexp_error'),
 				});
 			}
 			setLoading(false);
@@ -31,7 +33,9 @@ function Delete({ notifications, setChannels, id }: DeleteProps): JSX.Element {
 			notifications.error({
 				message: 'Error',
 				description:
-					error instanceof Error ? error.toString() : 'Something went wrong',
+					error instanceof Error
+						? error.toString()
+						: t('channel_delete_unexp_error'),
 			});
 			setLoading(false);
 		}

View File

@@ -8,16 +8,18 @@ import useComponentPermission from 'hooks/useComponentPermission';
 import useFetch from 'hooks/useFetch';
 import history from 'lib/history';
 import React, { useCallback } from 'react';
+import { useTranslation } from 'react-i18next';
 import { useSelector } from 'react-redux';
 import { AppState } from 'store/reducers';
 import AppReducer from 'types/reducer/app';

 import AlertChannelsComponent from './AlertChannels';
-import { Button, ButtonContainer } from './styles';
+import { Button, ButtonContainer, RightActionContainer } from './styles';

 const { Paragraph } = Typography;

 function AlertChannels(): JSX.Element {
+	const { t } = useTranslation(['channels']);
 	const { role } = useSelector<AppState, AppReducer>((state) => state.app);
 	const [addNewChannelPermission] = useComponentPermission(
 		['add_new_channel'],
@@ -34,28 +36,28 @@ function AlertChannels(): JSX.Element {
 	}

 	if (loading || payload === undefined) {
-		return <Spinner tip="Loading Channels.." height="90vh" />;
+		return <Spinner tip={t('loading_channels_message')} height="90vh" />;
 	}

 	return (
 		<>
 			<ButtonContainer>
 				<Paragraph ellipsis type="secondary">
-					The latest added channel is used as the default channel for sending alerts
+					{t('sending_channels_note')}
 				</Paragraph>
-				<div>
+				<RightActionContainer>
 					<TextToolTip
-						text="More details on how to setting notification channels"
+						text={t('tooltip_notification_channels')}
 						url="https://signoz.io/docs/userguide/alerts-management/#setting-notification-channel"
 					/>
 					{addNewChannelPermission && (
 						<Button onClick={onToggleHandler} icon={<PlusOutlined />}>
-							New Alert Channel
+							{t('button_new_channel')}
 						</Button>
 					)}
-				</div>
+				</RightActionContainer>
 			</ButtonContainer>
 			<AlertChannelsComponent allChannels={payload} />

View File

@@ -1,6 +1,13 @@
 import { Button as ButtonComponent } from 'antd';
 import styled from 'styled-components';

+export const RightActionContainer = styled.div`
+	&&& {
+		display: flex;
+		align-items: center;
+	}
+`;
+
 export const ButtonContainer = styled.div`
 	&&& {
 		display: flex;

View File

@@ -4,9 +4,12 @@ import React from 'react';
 import { useTranslation } from 'react-i18next';
 import { AlertDef, Labels } from 'types/api/alerts/def';

+import ChannelSelect from './ChannelSelect';
 import LabelSelect from './labels';
 import {
+	ChannelSelectTip,
 	FormContainer,
+	FormItemMedium,
 	InputSmall,
 	SeveritySelect,
 	StepHeading,
@@ -80,7 +83,7 @@ function BasicInfo({ alertDef, setAlertDef }: BasicInfoProps): JSX.Element {
 					}}
 				/>
 			</FormItem>
-			<FormItem label={t('field_labels')}>
+			<FormItemMedium label={t('field_labels')}>
 				<LabelSelect
 					onSetLabels={(l: Labels): void => {
 						setAlertDef({
@@ -92,7 +95,19 @@ function BasicInfo({ alertDef, setAlertDef }: BasicInfoProps): JSX.Element {
 					}}
 					initialValues={alertDef.labels}
 				/>
-			</FormItem>
+			</FormItemMedium>
+			<FormItemMedium label="Notification Channels">
+				<ChannelSelect
+					currentValue={alertDef.preferredChannels}
+					onSelectChannels={(s: string[]): void => {
+						setAlertDef({
+							...alertDef,
+							preferredChannels: s,
+						});
+					}}
+				/>
+				<ChannelSelectTip> {t('channel_select_tooltip')}</ChannelSelectTip>
+			</FormItemMedium>
 		</FormContainer>
 	</>
 );

View File

@@ -0,0 +1,70 @@
import { notification, Select } from 'antd';
import getChannels from 'api/channels/getAll';
import useFetch from 'hooks/useFetch';
import React from 'react';
import { useTranslation } from 'react-i18next';
import { StyledSelect } from './styles';
export interface ChannelSelectProps {
currentValue?: string[];
onSelectChannels: (s: string[]) => void;
}
function ChannelSelect({
currentValue,
onSelectChannels,
}: ChannelSelectProps): JSX.Element | null {
// init namespace for translations
const { t } = useTranslation('alerts');
const { loading, payload, error, errorMessage } = useFetch(getChannels);
const handleChange = (value: string[]): void => {
onSelectChannels(value);
};
if (error && errorMessage !== '') {
notification.error({
message: 'Error',
description: errorMessage,
});
}
const renderOptions = (): React.ReactNode[] => {
const children: React.ReactNode[] = [];
if (loading || payload === undefined || payload.length === 0) {
return children;
}
payload.forEach((o) => {
children.push(
<Select.Option key={o.id} value={o.name}>
{o.name}
</Select.Option>,
);
});
return children;
};
return (
<StyledSelect
status={error ? 'error' : ''}
mode="multiple"
style={{ width: '100%' }}
placeholder={t('placeholder_channel_select')}
value={currentValue}
onChange={(value): void => {
handleChange(value as string[]);
}}
optionLabelProp="label"
>
{renderOptions()}
</StyledSelect>
);
}
ChannelSelect.defaultProps = {
currentValue: [],
};
export default ChannelSelect;

View File

@@ -0,0 +1,6 @@
import { Select } from 'antd';
import styled from 'styled-components';
export const StyledSelect = styled(Select)`
border-radius: 4px;
`;

View File

@@ -21,7 +21,7 @@ export interface ChartPreviewProps {
 	selectedTime?: timePreferenceType;
 	selectedInterval?: Time;
 	headline?: JSX.Element;
-	threshold?: number;
+	threshold?: number | undefined;
 }

 function ChartPreview({
@@ -35,7 +35,7 @@ function ChartPreview({
 }: ChartPreviewProps): JSX.Element | null {
 	const { t } = useTranslation('alerts');
 	const staticLine: StaticLineProps | undefined =
-		threshold && threshold > 0
+		threshold !== undefined
 			? {
 					yMin: threshold,
 					yMax: threshold,
@@ -66,8 +66,12 @@ function ChartPreview({
 		}),
 		enabled:
 			query != null &&
-			(query.queryType !== EQueryType.PROM ||
-				(query.promQL?.length > 0 && query.promQL[0].query !== '')),
+			((query.queryType === EQueryType.PROM &&
+				query.promQL?.length > 0 &&
+				query.promQL[0].query !== '') ||
+				(query.queryType === EQueryType.QUERY_BUILDER &&
+					query.metricsBuilder?.queryBuilder?.length > 0 &&
+					query.metricsBuilder?.queryBuilder[0].metricName !== '')),
 	});

 	const chartDataSet = queryResponse.isError
@@ -113,7 +117,7 @@ ChartPreview.defaultProps = {
 	selectedTime: 'GLOBAL_TIME',
 	selectedInterval: '5min',
 	headline: undefined,
-	threshold: 0,
+	threshold: undefined,
 };

 export default ChartPreview;
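The guard change above fixes a truthiness bug: with `threshold && threshold > 0`, a legitimate threshold of `0` is falsy, so the static line silently disappears; checking `!== undefined` keeps it. A minimal illustration with plain functions standing in for the component logic:

```typescript
interface StaticLineProps {
	yMin: number;
	yMax: number;
}

// Old guard: a 0 threshold is falsy, so no line is produced.
function staticLineOld(threshold?: number): StaticLineProps | undefined {
	return threshold && threshold > 0
		? { yMin: threshold, yMax: threshold }
		: undefined;
}

// New guard: only an absent threshold skips the line.
function staticLineNew(threshold?: number): StaticLineProps | undefined {
	return threshold !== undefined
		? { yMin: threshold, yMax: threshold }
		: undefined;
}
```

The widened type `number | undefined` and the `threshold: undefined` default support the same intent: "no threshold set" is now distinct from "threshold of zero".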

View File

@@ -156,7 +156,9 @@ function RuleOptions({
 						...alertDef,
 						condition: {
 							...alertDef.condition,
-							target: (value as number) || undefined,
+							op: alertDef.condition?.op || defaultCompareOp,
+							matchType: alertDef.condition?.matchType || defaultMatchType,
+							target: value as number,
 						},
 					});
 				}}

View File

@@ -1,6 +1,7 @@
 import { ExclamationCircleOutlined, SaveOutlined } from '@ant-design/icons';
 import { FormInstance, Modal, notification, Typography } from 'antd';
 import saveAlertApi from 'api/alerts/save';
+import testAlertApi from 'api/alerts/testAlert';
 import ROUTES from 'constants/routes';
 import QueryTypeTag from 'container/NewWidget/LeftContainer/QueryTypeTag';
 import PlotTag from 'container/NewWidget/LeftContainer/WidgetGraph/PlotTag';
@@ -83,7 +84,7 @@ function FormAlertRules({
 	// staged query is used to display chart preview
 	const [stagedQuery, setStagedQuery] = useState<StagedQuery>();
-	const debouncedStagedQuery = useDebounce(stagedQuery, 500);
+	const debouncedStagedQuery = useDebounce(stagedQuery, 1000);

 	// this use effect initiates staged query and
 	// other queries based on server data.
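The preview debounce above is raised from 500 ms to 1000 ms. Framework aside, `useDebounce` reduces to a timer that resets on every new value; a plain sketch of that behavior (`debounce` here is illustrative, not the hook itself):

```typescript
// Return a wrapper that only invokes fn after `waitMs` of quiet:
// every call cancels the previous pending invocation.
function debounce<A extends unknown[]>(
	fn: (...args: A) => void,
	waitMs: number,
): (...args: A) => void {
	let timer: ReturnType<typeof setTimeout> | undefined;
	return (...args: A): void => {
		if (timer !== undefined) clearTimeout(timer);
		timer = setTimeout(() => fn(...args), waitMs);
	};
}
```

Doubling the wait means a user can pause longer mid-edit before the (potentially expensive) preview query fires.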
@@ -143,10 +144,74 @@ function FormAlertRules({
 			});
 		}
 	};
const validatePromParams = useCallback((): boolean => {
let retval = true;
if (queryCategory !== EQueryType.PROM) return retval;
if (!promQueries || Object.keys(promQueries).length === 0) {
notification.error({
message: 'Error',
description: t('promql_required'),
});
return false;
}
Object.keys(promQueries).forEach((key) => {
if (promQueries[key].query === '') {
notification.error({
message: 'Error',
description: t('promql_required'),
});
retval = false;
}
});
return retval;
}, [t, promQueries, queryCategory]);
const validateQBParams = useCallback((): boolean => {
let retval = true;
if (queryCategory !== EQueryType.QUERY_BUILDER) return true;
if (!metricQueries || Object.keys(metricQueries).length === 0) {
notification.error({
message: 'Error',
description: t('condition_required'),
});
return false;
}
if (!alertDef.condition?.target) {
notification.error({
message: 'Error',
description: t('target_missing'),
});
return false;
}
Object.keys(metricQueries).forEach((key) => {
if (metricQueries[key].metricName === '') {
notification.error({
message: 'Error',
description: t('metricname_missing', { where: metricQueries[key].name }),
});
retval = false;
}
});
Object.keys(formulaQueries).forEach((key) => {
if (formulaQueries[key].expression === '') {
notification.error({
message: 'Error',
description: t('expression_missing', formulaQueries[key].name),
});
retval = false;
}
});
return retval;
}, [t, alertDef, queryCategory, metricQueries, formulaQueries]);
 	const isFormValid = useCallback((): boolean => {
-		let retval = true;
 		if (!alertDef.alert || alertDef.alert === '') {
 			notification.error({
 				message: 'Error',
@@ -155,56 +220,14 @@ function FormAlertRules({
 			return false;
 		}

-		if (
-			queryCategory === EQueryType.PROM &&
-			(!promQueries || Object.keys(promQueries).length === 0)
-		) {
-			notification.error({
-				message: 'Error',
-				description: t('promql_required'),
-			});
+		if (!validatePromParams()) {
 			return false;
 		}

-		if (
-			(queryCategory === EQueryType.QUERY_BUILDER && !metricQueries) ||
-			Object.keys(metricQueries).length === 0
-		) {
-			notification.error({
-				message: 'Error',
-				description: t('condition_required'),
-			});
-			return false;
-		}
-		Object.keys(metricQueries).forEach((key) => {
-			if (metricQueries[key].metricName === '') {
-				retval = false;
-				notification.error({
-					message: 'Error',
-					description: t('metricname_missing', { where: metricQueries[key].name }),
-				});
-			}
-		});
-		Object.keys(formulaQueries).forEach((key) => {
-			if (formulaQueries[key].expression === '') {
-				retval = false;
-				notification.error({
-					message: 'Error',
-					description: t('expression_missing', formulaQueries[key].name),
-				});
-			}
-		});
-		return retval;
-	}, [t, alertDef, queryCategory, metricQueries, formulaQueries, promQueries]);
+		return validateQBParams();
+	}, [t, validateQBParams, alertDef, validatePromParams]);

-	const saveRule = useCallback(async () => {
-		if (!isFormValid()) {
-			return;
-		}
+	const preparePostData = (): AlertDef => {
 		const postableAlert: AlertDef = {
 			...alertDef,
 			source: window?.location.toString(),
@@ -219,6 +242,22 @@
 			},
 		},
 	};
+		return postableAlert;
+	};
+
+	const memoizedPreparePostData = useCallback(preparePostData, [
+		queryCategory,
+		alertDef,
+		metricQueries,
+		formulaQueries,
+		promQueries,
+	]);
+
+	const saveRule = useCallback(async () => {
+		if (!isFormValid()) {
+			return;
+		}
+		const postableAlert = memoizedPreparePostData();

 		setLoading(true);
 		try {
@@ -235,7 +274,7 @@
 					description:
 						!ruleId || ruleId === 0 ? t('rule_created') : t('rule_edited'),
 				});
-				console.log('invalidting cache');

 				// invalidate rule in cache
 				ruleCache.invalidateQueries(['ruleId', ruleId]);
@@ -249,24 +288,13 @@
 				});
 			}
 		} catch (e) {
-			console.log('save alert api failed:', e);
 			notification.error({
 				message: 'Error',
 				description: t('unexpected_error'),
 			});
 		}
 		setLoading(false);
-	}, [
-		t,
-		isFormValid,
-		queryCategory,
-		ruleId,
-		alertDef,
-		metricQueries,
-		formulaQueries,
-		promQueries,
-		ruleCache,
-	]);
+	}, [t, isFormValid, ruleId, ruleCache, memoizedPreparePostData]);

 	const onSaveHandler = useCallback(async () => {
 		const content = (
@@ -287,6 +315,44 @@
 		});
 	}, [t, saveRule, queryCategory]);
const onTestRuleHandler = useCallback(async () => {
if (!isFormValid()) {
return;
}
const postableAlert = memoizedPreparePostData();
setLoading(true);
try {
const response = await testAlertApi({ data: postableAlert });
if (response.statusCode === 200) {
const { payload } = response;
if (payload?.alertCount === 0) {
notification.error({
message: 'Error',
description: t('no_alerts_found'),
});
} else {
notification.success({
message: 'Success',
description: t('rule_test_fired'),
});
}
} else {
notification.error({
message: 'Error',
description: response.error || t('unexpected_error'),
});
}
} catch (e) {
notification.error({
message: 'Error',
description: t('unexpected_error'),
});
}
setLoading(false);
}, [t, isFormValid, memoizedPreparePostData]);
 	const renderBasicInfo = (): JSX.Element => (
 		<BasicInfo alertDef={alertDef} setAlertDef={setAlertDef} />
 	);
@@ -353,6 +419,14 @@
 					>
 						{ruleId > 0 ? t('button_savechanges') : t('button_createrule')}
 					</ActionButton>
+					<ActionButton
+						loading={loading || false}
+						type="default"
+						onClick={onTestRuleHandler}
+					>
+						{' '}
+						{t('button_testrule')}
+					</ActionButton>
 					<ActionButton
 						disabled={loading || false}
 						type="default"
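The validation refactor in this file splits one monolithic `isFormValid` into per-query-type helpers that each raise their own notification and report a boolean. Stripped of React, the PromQL half looks roughly like this (`Notify` and the query shape are simplified stand-ins, not the real types):

```typescript
type Notify = (messageKey: string) => void;

interface PromQuery {
	query: string;
}

// Fail fast on an empty map, then report every empty query individually,
// mirroring the forEach-with-flag structure of validatePromParams above.
function validatePromParams(
	promQueries: Record<string, PromQuery>,
	notify: Notify,
): boolean {
	if (Object.keys(promQueries).length === 0) {
		notify('promql_required');
		return false;
	}
	let valid = true;
	Object.values(promQueries).forEach((q) => {
		if (q.query === '') {
			notify('promql_required');
			valid = false;
		}
	});
	return valid;
}
```

Keeping each validator as its own `useCallback` also lets `isFormValid`'s dependency list shrink to the two helpers instead of every query map.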

View File

@@ -8,8 +8,7 @@ interface SearchContainerProps {
 }

 export const SearchContainer = styled.div<SearchContainerProps>`
-	width: 70%;
-	border-radisu: 4px;
+	border-radius: 4px;
 	background: ${({ isDarkMode }): string => (isDarkMode ? '#000' : '#fff')};
 	flex: 1;
 	display: flex;

View File

@@ -1,4 +1,15 @@
-import { Button, Card, Col, Form, Input, InputNumber, Row, Select } from 'antd';
+import {
+	Button,
+	Card,
+	Col,
+	Form,
+	Input,
+	InputNumber,
+	Row,
+	Select,
+	Typography,
+} from 'antd';
+import FormItem from 'antd/lib/form/FormItem';
 import TextArea from 'antd/lib/input/TextArea';
 import styled from 'styled-components';
@@ -67,21 +78,19 @@ export const InlineSelect = styled(Select)`
 `;

 export const SeveritySelect = styled(Select)`
-	width: 15% !important;
+	width: 25% !important;
 `;

 export const InputSmall = styled(Input)`
 	width: 40% !important;
 `;

-export const FormContainer = styled.div`
+export const FormContainer = styled(Card)`
 	padding: 2em;
 	margin-top: 1rem;
 	display: flex;
 	flex-direction: column;
-	background: #141414;
 	border-radius: 4px;
-	border: 1px solid #303030;
 `;

 export const ThresholdInput = styled(InputNumber)`
@@ -101,3 +110,11 @@ export const ThresholdInput = styled(InputNumber)`
 export const TextareaMedium = styled(TextArea)`
 	width: 70%;
 `;
+
+export const FormItemMedium = styled(FormItem)`
+	width: 70%;
+`;
+
+export const ChannelSelectTip = styled(Typography.Text)`
+	color: hsla(0, 0%, 100%, 0.3);
+`;

@@ -1,10 +1,11 @@
import { Button } from 'antd';
import { NotificationInstance } from 'antd/lib/notification/index';
import deleteAlerts from 'api/alerts/delete';
import { State } from 'hooks/useFetch';
import React, { useState } from 'react';
import { PayloadProps as DeleteAlertPayloadProps } from 'types/api/alerts/delete';
import { Alerts } from 'types/api/alerts/getAll'; import { GettableAlert } from 'types/api/alerts/get';
import { ColumnButton } from './styles';
function DeleteAlert({
id,
@@ -72,20 +73,20 @@ function DeleteAlert({
};
return (
<Button <ColumnButton
disabled={deleteAlertState.loading || false}
loading={deleteAlertState.loading || false}
onClick={(): Promise<void> => onDeleteHandler(id)}
type="link"
>
Delete
</Button> </ColumnButton>
);
}
interface DeleteAlertProps {
id: Alerts['id']; id: GettableAlert['id'];
setData: React.Dispatch<React.SetStateAction<Alerts[]>>; setData: React.Dispatch<React.SetStateAction<GettableAlert[]>>;
notifications: NotificationInstance;
}

@@ -1,6 +1,6 @@
/* eslint-disable react/display-name */
import { PlusOutlined } from '@ant-design/icons';
import { notification, Tag, Typography } from 'antd'; import { notification, Typography } from 'antd';
import Table, { ColumnsType } from 'antd/lib/table';
import TextToolTip from 'components/TextToolTip';
import ROUTES from 'constants/routes';
@@ -13,15 +13,16 @@ import { UseQueryResult } from 'react-query';
import { useSelector } from 'react-redux';
import { AppState } from 'store/reducers';
import { ErrorResponse, SuccessResponse } from 'types/api';
import { Alerts } from 'types/api/alerts/getAll'; import { GettableAlert } from 'types/api/alerts/get';
import AppReducer from 'types/reducer/app';
import DeleteAlert from './DeleteAlert';
import { Button, ButtonContainer } from './styles'; import { Button, ButtonContainer, ColumnButton, StyledTag } from './styles';
import Status from './TableComponents/Status';
import ToggleAlertState from './ToggleAlertState';
function ListAlert({ allAlertRules, refetch }: ListAlertProps): JSX.Element {
const [data, setData] = useState<Alerts[]>(allAlertRules || []); const [data, setData] = useState<GettableAlert[]>(allAlertRules || []);
const { t } = useTranslation('common');
const { role } = useSelector<AppState, AppReducer>((state) => state.app);
const [addNewAlert, action] = useComponentPermission(
@@ -53,22 +54,27 @@ function ListAlert({ allAlertRules, refetch }: ListAlertProps): JSX.Element {
history.push(`${ROUTES.EDIT_ALERTS}?ruleId=${id}`);
};
const columns: ColumnsType<Alerts> = [ const columns: ColumnsType<GettableAlert> = [
{
title: 'Status',
dataIndex: 'state',
key: 'state',
sorter: (a, b): number =>
b.labels.severity.length - a.labels.severity.length, (b.state ? b.state.charCodeAt(0) : 1000) -
(a.state ? a.state.charCodeAt(0) : 1000),
render: (value): JSX.Element => <Status status={value} />,
},
{
title: 'Alert Name',
dataIndex: 'alert',
key: 'name',
sorter: (a, b): number => a.name.charCodeAt(0) - b.name.charCodeAt(0), sorter: (a, b): number =>
(a.alert ? a.alert.charCodeAt(0) : 1000) -
(b.alert ? b.alert.charCodeAt(0) : 1000),
render: (value, record): JSX.Element => (
<Typography.Link onClick={(): void => onEditHandler(record.id.toString())}> <Typography.Link
onClick={(): void => onEditHandler(record.id ? record.id.toString() : '')}
>
{value}
</Typography.Link>
),
@@ -78,7 +84,8 @@ function ListAlert({ allAlertRules, refetch }: ListAlertProps): JSX.Element {
dataIndex: 'labels',
key: 'severity',
sorter: (a, b): number =>
a.labels.severity.length - b.labels.severity.length, (a.labels ? a.labels.severity.length : 0) -
(b.labels ? b.labels.severity.length : 0),
render: (value): JSX.Element => {
const objectKeys = Object.keys(value);
const withSeverityKey = objectKeys.find((e) => e === 'severity') || '';
@@ -92,6 +99,7 @@ function ListAlert({ allAlertRules, refetch }: ListAlertProps): JSX.Element {
dataIndex: 'labels',
key: 'tags',
align: 'center',
width: 350,
render: (value): JSX.Element => {
const objectKeys = Object.keys(value);
const withOutSeverityKeys = objectKeys.filter((e) => e !== 'severity');
@@ -104,9 +112,9 @@ function ListAlert({ allAlertRules, refetch }: ListAlertProps): JSX.Element {
<>
{withOutSeverityKeys.map((e) => {
return (
<Tag key={e} color="magenta"> <StyledTag key={e} color="magenta">
{e}: {value[e]}
</Tag> </StyledTag>
);
})}
</>
@@ -120,14 +128,19 @@ function ListAlert({ allAlertRules, refetch }: ListAlertProps): JSX.Element {
title: 'Action',
dataIndex: 'id',
key: 'action',
render: (id: Alerts['id']): JSX.Element => { render: (id: GettableAlert['id'], record): JSX.Element => {
return (
<>
<DeleteAlert notifications={notifications} setData={setData} id={id} /> <ToggleAlertState disabled={record.disabled} setData={setData} id={id} />
<Button onClick={(): void => onEditHandler(id.toString())} type="link"> <ColumnButton
onClick={(): void => onEditHandler(id.toString())}
type="link"
>
Edit
</Button> </ColumnButton>
<DeleteAlert notifications={notifications} setData={setData} id={id} />
</>
);
},
@@ -159,8 +172,10 @@ function ListAlert({ allAlertRules, refetch }: ListAlertProps): JSX.Element {
}
interface ListAlertProps {
allAlertRules: Alerts[]; allAlertRules: GettableAlert[];
refetch: UseQueryResult<ErrorResponse | SuccessResponse<Alerts[]>>['refetch']; refetch: UseQueryResult<
ErrorResponse | SuccessResponse<GettableAlert[]>
>['refetch'];
}
export default ListAlert;
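The new column sorters above guard against missing fields by substituting a large sentinel (1000) so rows without a value sink to one end. A minimal sketch of that comparator pattern, with made-up sample rows:

```javascript
// Comparator for the 'Alert Name' column: sort by the first character code
// of `alert`, pushing rows with a missing name to the end (sentinel 1000).
const byAlertName = (a, b) =>
	(a.alert ? a.alert.charCodeAt(0) : 1000) -
	(b.alert ? b.alert.charCodeAt(0) : 1000);

const rows = [{ alert: 'cpu high' }, {}, { alert: 'apdex low' }];
rows.sort(byAlertName);
// 'apdex low' first, 'cpu high' second, the unnamed row last
```

The same shape is reused for the Status and Severity columns, each with its own field guard.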

@@ -1,6 +1,6 @@
import { Tag } from 'antd';
import React from 'react';
import { Alerts } from 'types/api/alerts/getAll'; import { GettableAlert } from 'types/api/alerts/get';
function Status({ status }: StatusProps): JSX.Element {
switch (status) {
@@ -16,14 +16,18 @@ function Status({ status }: StatusProps): JSX.Element {
return <Tag color="red">Firing</Tag>;
}
case 'disabled': {
return <Tag>Disabled</Tag>;
}
default: {
return <Tag color="default">Unknown Status</Tag>; return <Tag color="default">Unknown</Tag>;
}
}
}
interface StatusProps {
status: Alerts['state']; status: GettableAlert['state'];
}
export default Status;

@@ -0,0 +1,108 @@
import { notification } from 'antd';
import patchAlert from 'api/alerts/patch';
import { State } from 'hooks/useFetch';
import React, { useState } from 'react';
import { GettableAlert } from 'types/api/alerts/get';
import { PayloadProps as PatchPayloadProps } from 'types/api/alerts/patch';
import { ColumnButton } from './styles';
function ToggleAlertState({
id,
disabled,
setData,
}: ToggleAlertStateProps): JSX.Element {
const [apiStatus, setAPIStatus] = useState<State<PatchPayloadProps>>({
error: false,
errorMessage: '',
loading: false,
success: false,
payload: undefined,
});
const defaultErrorMessage = 'Something went wrong';
const onToggleHandler = async (
id: number,
disabled: boolean,
): Promise<void> => {
try {
setAPIStatus((state) => ({
...state,
loading: true,
}));
const response = await patchAlert({
id,
data: {
disabled,
},
});
if (response.statusCode === 200) {
setData((state) => {
return state.map((alert) => {
if (alert.id === id) {
return {
...alert,
disabled: response.payload.disabled,
state: response.payload.state,
};
}
return alert;
});
});
setAPIStatus((state) => ({
...state,
loading: false,
payload: response.payload,
}));
notification.success({
message: 'Success',
});
} else {
setAPIStatus((state) => ({
...state,
loading: false,
error: true,
errorMessage: response.error || defaultErrorMessage,
}));
notification.error({
message: response.error || defaultErrorMessage,
});
}
} catch (error) {
setAPIStatus((state) => ({
...state,
loading: false,
error: true,
errorMessage: defaultErrorMessage,
}));
notification.error({
message: defaultErrorMessage,
});
}
};
return (
<ColumnButton
disabled={apiStatus.loading || false}
loading={apiStatus.loading || false}
onClick={(): Promise<void> => onToggleHandler(id, !disabled)}
type="link"
>
{disabled ? 'Enable' : 'Disable'}
</ColumnButton>
);
}
interface ToggleAlertStateProps {
id: GettableAlert['id'];
disabled: boolean;
setData: React.Dispatch<React.SetStateAction<GettableAlert[]>>;
}
export default ToggleAlertState;
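The `setData` updater in `onToggleHandler` flips a single alert by mapping over the list and replacing only the matching entry, leaving every other element untouched. A plain-JavaScript sketch of the same immutable-update pattern (the alert objects here are made up):

```javascript
// Return a new array where only the entry whose id matches is replaced;
// the original array and all other entries are left as-is.
function toggleAlert(alerts, id, disabled) {
	return alerts.map((alert) =>
		alert.id === id ? { ...alert, disabled } : alert,
	);
}

const alerts = [
	{ id: 1, alert: 'High latency', disabled: false },
	{ id: 2, alert: 'Error rate', disabled: false },
];

const next = toggleAlert(alerts, 2, true);
// `alerts` is not mutated; only the entry with id 2 differs in `next`.
```

Because unchanged entries are the same object references, React can skip re-rendering rows that did not change.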

@@ -1,4 +1,4 @@
import { Button as ButtonComponent } from 'antd'; import { Button as ButtonComponent, Tag } from 'antd';
import styled from 'styled-components';
export const ButtonContainer = styled.div`
@@ -12,6 +12,20 @@ export const ButtonContainer = styled.div`
export const Button = styled(ButtonComponent)`
&&& {
margin-left: 1rem; margin-left: 1em;
}
`;
export const ColumnButton = styled(ButtonComponent)`
&&& {
padding-left: 0;
padding-right: 0;
margin-right: 1.5em;
}
`;
export const StyledTag = styled(Tag)`
&&& {
white-space: normal;
}
`;

@@ -15,7 +15,7 @@ import { PromQLWidgets } from 'types/api/dashboard/getAll';
import MetricReducer from 'types/reducer/metrics';
import { Card, Col, GraphContainer, GraphTitle, Row } from '../styles';
import TopEndpointsTable from '../TopEndpointsTable'; import TopOperationsTable from '../TopOperationsTable';
import { Button } from './styles';
function Application({ getWidget }: DashboardProps): JSX.Element {
@@ -23,11 +23,13 @@ function Application({ getWidget }: DashboardProps): JSX.Element {
const selectedTimeStamp = useRef(0);
const {
topEndPoints, topOperations,
serviceOverview,
resourceAttributePromQLQuery,
resourceAttributeQueries,
topLevelOperations,
} = useSelector<AppState, MetricReducer>((state) => state.metrics);
const operationsRegex = topLevelOperations.join('|');
const selectedTraceTags: string = JSON.stringify(
convertRawQueriesToTraceSelectedTags(resourceAttributeQueries, 'array') || [],
@@ -107,7 +109,7 @@ function Application({ getWidget }: DashboardProps): JSX.Element {
<Button
type="default"
size="small"
id="Application_button" id="Service_button"
onClick={(): void => {
onTracePopupClick(selectedTimeStamp.current);
}}
@@ -115,13 +117,13 @@ function Application({ getWidget }: DashboardProps): JSX.Element {
View Traces
</Button>
<Card>
<GraphTitle>Application latency</GraphTitle> <GraphTitle>Latency</GraphTitle>
<GraphContainer>
<Graph
onClickHandler={(ChartEvent, activeElements, chart, data): void => {
onClickHandler(ChartEvent, activeElements, chart, data, 'Application'); onClickHandler(ChartEvent, activeElements, chart, data, 'Service');
}}
name="application_latency" name="service_latency"
type="line"
data={{
datasets: [
@@ -175,7 +177,7 @@ function Application({ getWidget }: DashboardProps): JSX.Element {
<Button
type="default"
size="small"
id="Request_button" id="Rate_button"
onClick={(): void => {
onTracePopupClick(selectedTimeStamp.current);
}}
@@ -183,21 +185,21 @@ function Application({ getWidget }: DashboardProps): JSX.Element {
View Traces
</Button>
<Card>
<GraphTitle>Requests</GraphTitle> <GraphTitle>Rate (ops/s)</GraphTitle>
<GraphContainer>
<FullView
name="request_per_sec" name="operations_per_sec"
fullViewOptions={false}
onClickHandler={(event, element, chart, data): void => {
onClickHandler(event, element, chart, data, 'Request'); onClickHandler(event, element, chart, data, 'Rate');
}}
widget={getWidget([
{
query: `sum(rate(signoz_latency_count{service_name="${servicename}", span_kind="SPAN_KIND_SERVER"${resourceAttributePromQLQuery}}[5m]))`, query: `sum(rate(signoz_latency_count{service_name="${servicename}", operation=~"${operationsRegex}"${resourceAttributePromQLQuery}}[5m]))`,
legend: 'Requests', legend: 'Operations',
},
])}
yAxisUnit="reqps" yAxisUnit="ops"
/>
</GraphContainer>
</Card>
@@ -227,7 +229,7 @@ function Application({ getWidget }: DashboardProps): JSX.Element {
}}
widget={getWidget([
{
query: `max(sum(rate(signoz_calls_total{service_name="${servicename}", span_kind="SPAN_KIND_SERVER", status_code="STATUS_CODE_ERROR"${resourceAttributePromQLQuery}}[5m]) OR rate(signoz_calls_total{service_name="${servicename}", span_kind="SPAN_KIND_SERVER", http_status_code=~"5.."${resourceAttributePromQLQuery}}[5m]))*100/sum(rate(signoz_calls_total{service_name="${servicename}", span_kind="SPAN_KIND_SERVER"${resourceAttributePromQLQuery}}[5m]))) < 1000 OR vector(0)`, query: `max(sum(rate(signoz_calls_total{service_name="${servicename}", operation=~"${operationsRegex}", status_code="STATUS_CODE_ERROR"${resourceAttributePromQLQuery}}[5m]) OR rate(signoz_calls_total{service_name="${servicename}", operation=~"${operationsRegex}", http_status_code=~"5.."${resourceAttributePromQLQuery}}[5m]))*100/sum(rate(signoz_calls_total{service_name="${servicename}", operation=~"${operationsRegex}"${resourceAttributePromQLQuery}}[5m]))) < 1000 OR vector(0)`,
legend: 'Error Percentage',
},
])}
@@ -239,7 +241,7 @@ function Application({ getWidget }: DashboardProps): JSX.Element {
<Col span={12}>
<Card>
<TopEndpointsTable data={topEndPoints} /> <TopOperationsTable data={topOperations} />
</Card>
</Col>
</Row>
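The dashboard now scopes its PromQL queries to top-level operations by joining the operation names into an alternation and interpolating it as an `operation=~"…"` matcher. A small sketch of that string construction (the service and operation names are made up for illustration):

```javascript
// Build the regex label matcher interpolated into the PromQL queries above.
const topLevelOperations = ['GET /api/v1/services', 'POST /api/v1/alerts'];
const operationsRegex = topLevelOperations.join('|');

const servicename = 'frontend';
const query =
	`sum(rate(signoz_latency_count{service_name="${servicename}", ` +
	`operation=~"${operationsRegex}"}[5m]))`;
```

Note that the names are joined verbatim; operation names containing regex metacharacters would need escaping before being used this way.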

@@ -11,7 +11,7 @@ import { AppState } from 'store/reducers';
import { GlobalReducer } from 'types/reducer/globalTime';
import MetricReducer from 'types/reducer/metrics';
function TopEndpointsTable(props: TopEndpointsTableProps): JSX.Element { function TopOperationsTable(props: TopOperationsTableProps): JSX.Element {
const { minTime, maxTime } = useSelector<AppState, GlobalReducer>(
(state) => state.globalTime,
);
@@ -85,7 +85,7 @@ function TopEndpointsTable(props: TopEndpointsTableProps): JSX.Element {
title: 'Number of Calls',
dataIndex: 'numCalls',
key: 'numCalls',
sorter: (a: TopEndpointListItem, b: TopEndpointListItem): number => sorter: (a: TopOperationListItem, b: TopOperationListItem): number =>
a.numCalls - b.numCalls,
},
];
@@ -94,7 +94,7 @@ function TopEndpointsTable(props: TopEndpointsTableProps): JSX.Element {
<Table
showHeader
title={(): string => {
return 'Top Endpoints'; return 'Key Operations';
}}
tableLayout="fixed"
dataSource={data}
@@ -104,7 +104,7 @@ function TopEndpointsTable(props: TopEndpointsTableProps): JSX.Element {
);
}
interface TopEndpointListItem { interface TopOperationListItem {
p50: number;
p95: number;
p99: number;
@@ -112,10 +112,10 @@ interface TopEndpointListItem {
name: string;
}
type DataProps = TopEndpointListItem; type DataProps = TopOperationListItem;
interface TopEndpointsTableProps { interface TopOperationsTableProps {
data: TopEndpointListItem[]; data: TopOperationListItem[];
}
export default TopEndpointsTable; export default TopOperationsTable;

@@ -56,14 +56,14 @@ function Metrics(): JSX.Element {
render: (value: number): string => (value / 1000000).toFixed(2),
},
{
title: 'Error Rate (% of requests)', title: 'Error Rate (% of total)',
dataIndex: 'errorRate',
key: 'errorRate',
sorter: (a: DataProps, b: DataProps): number => a.errorRate - b.errorRate,
render: (value: number): string => value.toFixed(2),
},
{
title: 'Requests Per Second', title: 'Operations Per Second',
dataIndex: 'callRate',
key: 'callRate',
sorter: (a: DataProps, b: DataProps): number => a.callRate - b.callRate,

@@ -42,8 +42,9 @@ export interface Option {
}
export const ServiceMapOptions: Option[] = [
{ value: '1min', label: 'Last 1 min' },
{ value: '5min', label: 'Last 5 min' },
{ value: '15min', label: 'Last 15 min' },
{ value: '30min', label: 'Last 30 min' },
];
export const getDefaultOption = (route: string): Time => {

@@ -2,7 +2,7 @@
import type { SelectProps } from 'antd';
import { Tag } from 'antd';
import React, { useCallback, useMemo } from 'react';
import { Alerts } from 'types/api/alerts/getAll'; import { Alerts } from 'types/api/alerts/getTriggered';
import { Container, Select } from './styles';

@@ -2,7 +2,7 @@ import { Tag, Typography } from 'antd';
import convertDateToAmAndPm from 'lib/convertDateToAmAndPm';
import getFormattedDate from 'lib/getFormatedDate';
import React from 'react';
import { Alerts } from 'types/api/alerts/getAll'; import { Alerts } from 'types/api/alerts/getTriggered';
import Status from '../TableComponents/AlertStatus';
import { TableCell, TableRow } from './styles';

@@ -1,7 +1,7 @@
import { MinusSquareOutlined, PlusSquareOutlined } from '@ant-design/icons';
import { Tag } from 'antd';
import React, { useState } from 'react';
import { Alerts } from 'types/api/alerts/getAll'; import { Alerts } from 'types/api/alerts/getTriggered';
import ExapandableRow from './ExapandableRow';
import { IconContainer, StatusContainer, TableCell, TableRow } from './styles';

@@ -1,6 +1,6 @@
import groupBy from 'lodash-es/groupBy';
import React, { useMemo } from 'react';
import { Alerts } from 'types/api/alerts/getAll'; import { Alerts } from 'types/api/alerts/getTriggered';
import { Value } from '../Filter';
import { FilterAlerts } from '../utils';

@@ -5,7 +5,7 @@ import AlertStatus from 'container/TriggeredAlerts/TableComponents/AlertStatus';
import convertDateToAmAndPm from 'lib/convertDateToAmAndPm';
import getFormattedDate from 'lib/getFormatedDate';
import React from 'react';
import { Alerts } from 'types/api/alerts/getAll'; import { Alerts } from 'types/api/alerts/getTriggered';
import { Value } from './Filter';
import { FilterAlerts } from './utils';

@@ -1,7 +1,7 @@
import getTriggeredApi from 'api/alerts/getTriggered';
import useInterval from 'hooks/useInterval';
import React, { useState } from 'react';
import { Alerts } from 'types/api/alerts/getAll'; import { Alerts } from 'types/api/alerts/getTriggered';
import Filter, { Value } from './Filter';
import FilteredTable from './FilteredTable';

@@ -1,4 +1,4 @@
import { Alerts } from 'types/api/alerts/getAll'; import { Alerts } from 'types/api/alerts/getTriggered';
import { Value } from './Filter';

@@ -45,6 +45,9 @@ interface graphLink {
source: string;
target: string;
value: number;
callRate: number;
errorRate: number;
p99: number;
}
export interface graphDataType {
nodes: graphNode[];
@@ -96,16 +99,16 @@ function ServiceMap(props: ServiceMapProps): JSX.Element {
const graphData = { nodes, links };
return (
<Container>
<SelectService {/* <SelectService
services={serviceMap.services} services={serviceMap.items}
zoomToService={zoomToService}
zoomToDefault={zoomToDefault}
/> /> */}
<ForceGraph2D
ref={fgRef}
cooldownTicks={100}
graphData={graphData}
nodeLabel={getTooltip} linkLabel={getTooltip}
linkAutoColorBy={(d) => d.target}
linkDirectionalParticles="value"
linkDirectionalParticleSpeed={(d) => d.value}
@@ -124,7 +127,7 @@ function ServiceMap(props: ServiceMapProps): JSX.Element {
ctx.fillStyle = isDarkMode ? '#ffffff' : '#000000';
ctx.fillText(label, node.x, node.y);
}}
onNodeClick={(node) => { onLinkHover={(node) => {
const tooltip = document.querySelector('.graph-tooltip');
if (tooltip && node) {
tooltip.innerHTML = getTooltip(node);

@@ -1,12 +1,13 @@
/*eslint-disable*/
//@ts-nocheck
import { cloneDeep, find, maxBy, uniq, uniqBy } from 'lodash-es'; import { cloneDeep, find, maxBy, uniq, uniqBy, groupBy, sumBy } from 'lodash-es';
import { graphDataType } from './ServiceMap';
const MIN_WIDTH = 10;
const MAX_WIDTH = 20;
const DEFAULT_FONT_SIZE = 6;
export const getDimensions = (num, highest) => {
const percentage = (num / highest) * 100;
const width = (percentage * (MAX_WIDTH - MIN_WIDTH)) / 100 + MIN_WIDTH;
@@ -18,19 +19,30 @@ export const getDimensions = (num, highest) => {
};
export const getGraphData = (serviceMap, isDarkMode): graphDataType => {
const { items, services } = serviceMap; const { items } = serviceMap;
const services = Object.values(groupBy(items, 'child')).map((e) => {
return {
serviceName: e[0].child,
errorRate: sumBy(e, 'errorRate'),
callRate: sumBy(e, 'callRate'),
}
});
const highestCallCount = maxBy(items, (e) => e?.callCount)?.callCount;
const highestCallRate = maxBy(services, (e) => e?.callRate)?.callRate;
const divNum = Number(
String(1).padEnd(highestCallCount.toString().length, '0'),
);
const links = cloneDeep(items).map((node) => {
const { parent, child, callCount } = node; const { parent, child, callCount, callRate, errorRate, p99 } = node;
return {
source: parent,
target: child,
value: (100 - callCount / divNum) * 0.03,
callRate,
errorRate,
p99,
};
});
const uniqParent = uniqBy(cloneDeep(items), 'parent').map((e) => e.parent);
@@ -47,15 +59,10 @@ export const getGraphData = (serviceMap, isDarkMode): graphDataType => {
width: MIN_WIDTH,
color,
nodeVal: MIN_WIDTH,
callRate: 0,
errorRate: 0,
p99: 0,
};
}
if (service.errorRate > 0) {
color = isDarkMode ? '#DB836E' : '#F98989';
} else if (service.fourXXRate > 0) {
color = isDarkMode ? '#C79931' : '#F9DA7B';
}
const { fontSize, width } = getDimensions(service.callRate, highestCallRate);
return {
@@ -65,9 +72,6 @@ export const getGraphData = (serviceMap, isDarkMode): graphDataType => {
width,
color,
nodeVal: width,
callRate: service.callRate.toFixed(2),
errorRate: service.errorRate,
p99: service.p99,
};
});
return {
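`getGraphData` now derives per-service aggregates from the edge list by grouping items on their child service and summing the rates, via lodash `groupBy`/`sumBy`. A dependency-free sketch of the same aggregation (the item shape follows the diff; the sample edges are made up):

```javascript
// Group edges by their child service and sum call/error rates per service,
// mirroring the lodash groupBy/sumBy combination used in getGraphData.
function aggregateServices(items) {
	const byChild = {};
	for (const item of items) {
		(byChild[item.child] = byChild[item.child] || []).push(item);
	}
	return Object.values(byChild).map((group) => ({
		serviceName: group[0].child,
		errorRate: group.reduce((sum, e) => sum + e.errorRate, 0),
		callRate: group.reduce((sum, e) => sum + e.callRate, 0),
	}));
}

const items = [
	{ parent: 'frontend', child: 'orders', callRate: 2, errorRate: 0.1 },
	{ parent: 'gateway', child: 'orders', callRate: 3, errorRate: 0.2 },
	{ parent: 'frontend', child: 'users', callRate: 1, errorRate: 0 },
];
const services = aggregateServices(items);
```

Each service node's size and color are then derived from these summed rates, so a service called by several parents reflects its total traffic.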
@@ -90,25 +94,31 @@ export const getZoomPx = (): number => {
return 190;
};
export const getTooltip = (node: { const getRound2DigitsAfterDecimal = (num: number) => {
if (num === 0) {
return 0;
}
return num.toFixed(20).match(/^-?\d*\.?0*\d{0,2}/)[0];
}
export const getTooltip = (link: {
p99: number;
errorRate: number;
callRate: number;
id: string;
}) => {
return `<div style="color:#333333;padding:12px;background: white;border-radius: 2px;">
<div style="font-weight:bold; margin-bottom:16px;">${node.id}</div>
<div class="keyval">
<div class="key">P99 latency:</div>
<div class="val">${node.p99 / 1000000}ms</div> <div class="val">${getRound2DigitsAfterDecimal(link.p99/ 1000000)}ms</div>
</div>
<div class="keyval">
<div class="key">Request:</div>
<div class="val">${node.callRate}/sec</div> <div class="val">${getRound2DigitsAfterDecimal(link.callRate)}/sec</div>
</div>
<div class="keyval">
<div class="key">Error Rate:</div>
<div class="val">${node.errorRate}%</div> <div class="val">${getRound2DigitsAfterDecimal(link.errorRate)}%</div>
</div>
</div>`;
};
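The new `getRound2DigitsAfterDecimal` helper truncates to the first two significant decimal digits rather than using plain `toFixed(2)`, which would flatten small rates like 0.00123 to "0.00". A standalone sketch of the same `toFixed(20)`-plus-regex technique:

```javascript
// Keep up to two significant digits after the decimal point, preserving
// leading zeros: 0.00123 -> "0.0012", where toFixed(2) would give "0.00".
function round2DigitsAfterDecimal(num) {
	if (num === 0) {
		return 0;
	}
	// toFixed(20) avoids scientific notation for very small values;
	// the regex then takes the leading zeros plus up to two more digits.
	return num.toFixed(20).match(/^-?\d*\.?0*\d{0,2}/)[0];
}

round2DigitsAfterDecimal(12.3456); // "12.34"
round2DigitsAfterDecimal(0.00123); // "0.0012"
```

Note the result is a string (except for the `0` shortcut), which is fine here since it is interpolated straight into tooltip HTML.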

@@ -3,7 +3,8 @@
 // import getExternalError from 'api/metrics/getExternalError';
 // import getExternalService from 'api/metrics/getExternalService';
 import getServiceOverview from 'api/metrics/getServiceOverview';
-import getTopEndPoints from 'api/metrics/getTopEndPoints';
+import getTopLevelOperations from 'api/metrics/getTopLevelOperations';
+import getTopOperations from 'api/metrics/getTopOperations';
 import { AxiosError } from 'axios';
 import GetMinMax from 'lib/getMinMax';
 import getStep from 'lib/getStep';
@@ -46,7 +47,8 @@ export const GetInitialData = (
 			// getExternalErrorResponse,
 			// getExternalServiceResponse,
 			getServiceOverviewResponse,
-			getTopEndPointsResponse,
+			getTopOperationsResponse,
+			getTopLevelOperationsResponse,
 		] = await Promise.all([
 			// getDBOverView({
 			// 	...props,
@@ -67,12 +69,15 @@ export const GetInitialData = (
 				step: getStep({ start: minTime, end: maxTime, inputFormat: 'ns' }),
 				selectedTags: props.selectedTags,
 			}),
-			getTopEndPoints({
+			getTopOperations({
 				end: maxTime,
 				service: props.serviceName,
 				start: minTime,
 				selectedTags: props.selectedTags,
 			}),
+			getTopLevelOperations({
+				service: props.serviceName,
+			}),
 		]);

 		if (
@@ -81,7 +86,8 @@ export const GetInitialData = (
 			// getExternalErrorResponse.statusCode === 200 &&
 			// getExternalServiceResponse.statusCode === 200 &&
 			getServiceOverviewResponse.statusCode === 200 &&
-			getTopEndPointsResponse.statusCode === 200
+			getTopOperationsResponse.statusCode === 200 &&
+			getTopLevelOperationsResponse.statusCode === 200
 		) {
 			dispatch({
 				type: 'GET_INTIAL_APPLICATION_DATA',
@@ -91,7 +97,8 @@ export const GetInitialData = (
 					// externalError: getExternalErrorResponse.payload,
 					// externalService: getExternalServiceResponse.payload,
 					serviceOverview: getServiceOverviewResponse.payload,
-					topEndPoints: getTopEndPointsResponse.payload,
+					topOperations: getTopOperationsResponse.payload,
+					topLevelOperations: getTopLevelOperationsResponse.payload,
 				},
 			});
 		} else {
@@ -99,8 +106,9 @@ export const GetInitialData = (
 				type: 'GET_INITIAL_APPLICATION_ERROR',
 				payload: {
 					errorMessage:
-						getTopEndPointsResponse.error ||
+						getTopOperationsResponse.error ||
 						getServiceOverviewResponse.error ||
+						getTopLevelOperationsResponse.error ||
 						// getExternalServiceResponse.error ||
 						// getExternalErrorResponse.error ||
 						// getExternalAverageDurationResponse.error ||

View File

@@ -6,26 +6,16 @@ import { ActionTypes } from './types';

 export interface ServiceMapStore {
 	items: ServicesMapItem[];
-	services: ServicesItem[];
 	loading: boolean;
 }

-export interface ServicesItem {
-	serviceName: string;
-	p99: number;
-	avgDuration: number;
-	numCalls: number;
-	callRate: number;
-	numErrors: number;
-	errorRate: number;
-	num4XX: number;
-	fourXXRate: number;
-}
-
 export interface ServicesMapItem {
 	parent: string;
 	child: string;
 	callCount: number;
+	callRate: number;
+	errorRate: number;
+	p99: number;
 }

 export interface ServiceMapItemAction {
@@ -33,11 +23,6 @@ export interface ServiceMapItemAction {
 	payload: ServicesMapItem[];
 }

-export interface ServicesAction {
-	type: ActionTypes.getServices;
-	payload: ServicesItem[];
-}
-
 export interface ServiceMapLoading {
 	type: ActionTypes.serviceMapLoading;
 	payload: {
@@ -55,19 +40,13 @@ export const getDetailedServiceMapItems = (globalTime: GlobalTime) => {
 			end,
 			tags: [],
 		};
-		const [serviceMapDependenciesResponse, response] = await Promise.all([
-			api.post<ServicesMapItem[]>(`/serviceMapDependencies`, serviceMapPayload),
-			api.post<ServicesItem[]>(`/services`, serviceMapPayload),
+		const [dependencyGraphResponse] = await Promise.all([
+			api.post<ServicesMapItem[]>(`/dependency_graph`, serviceMapPayload),
 		]);

-		dispatch<ServicesAction>({
-			type: ActionTypes.getServices,
-			payload: response.data,
-		});
-
 		dispatch<ServiceMapItemAction>({
 			type: ActionTypes.getServiceMapItems,
-			payload: serviceMapDependenciesResponse.data,
+			payload: dependencyGraphResponse.data,
 		});

 		dispatch<ServiceMapLoading>({
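The hunk above drops the separate `/services` call and widens `ServicesMapItem` so each service-map edge carries its own metrics, which is what the tooltip earlier in this diff reads from `link`. An illustrative sample of the `/dependency_graph` edge shape (the field list follows the diff; the sample values are made up):

```typescript
interface ServicesMapItem {
  parent: string;
  child: string;
  callCount: number;
  callRate: number; // calls per second
  errorRate: number; // percentage
  p99: number; // nanoseconds
}

// Made-up sample edge, shaped like the new API response
const sampleLink: ServicesMapItem = {
  parent: 'frontend',
  child: 'query-service',
  callCount: 120,
  callRate: 2.5,
  errorRate: 0.8,
  p99: 1_234_567,
};

// The tooltip converts p99 from nanoseconds to milliseconds for display
const p99Ms = sampleLink.p99 / 1_000_000;
```

Packing the metrics onto each edge means one request paints the whole map, instead of joining two payloads client-side.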

View File

@@ -1,8 +1,4 @@
-import {
-	ServiceMapItemAction,
-	ServiceMapLoading,
-	ServicesAction,
-} from './serviceMap';
+import { ServiceMapItemAction, ServiceMapLoading } from './serviceMap';
 import { GetUsageDataAction } from './usage';

 export enum ActionTypes {
@@ -17,6 +13,5 @@ export enum ActionTypes {

 export type Action =
 	| GetUsageDataAction
-	| ServicesAction
 	| ServiceMapItemAction
 	| ServiceMapLoading;

View File

@@ -18,4 +18,8 @@
 	),
 );

+if (window !== undefined) {
+	window.store = store;
+}
+
 export default store;
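Exposing the store on `window` (above) is what lets the Playwright specs later in this diff read Redux state via `page.evaluate(() => window.store.getState())`. A self-contained sketch of the pattern — the mini store below is illustrative, not SigNoz's actual store. Note that `typeof window !== 'undefined'` is the safer spelling: plain `window !== undefined` throws a ReferenceError in environments where the identifier is not declared at all.

```typescript
interface MiniStore<S> {
  getState: () => S;
}

// Tiny stand-in for a Redux store (illustrative only)
const createMiniStore = <S>(initial: S): MiniStore<S> => {
  const state = initial;
  return { getState: () => state };
};

const store = createMiniStore({ app: { isLoggedIn: true } });

// Guarded assignment: runs in the browser, is skipped under Node/SSR
if (typeof window !== 'undefined') {
  (window as unknown as { store: MiniStore<unknown> }).store = store;
}

export default store;
```

In an e2e test the exposed store can then be asserted directly, e.g. `const { app } = await page.evaluate(() => window.store.getState());`.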

View File

@@ -10,7 +10,9 @@ const intitalState: GlobalReducer = {
 	maxTime: Date.now() * 1000000,
 	minTime: (Date.now() - 15 * 60 * 1000) * 1000000,
 	loading: true,
-	selectedTime: getDefaultOption(window.location.pathname),
+	selectedTime: getDefaultOption(
+		typeof window !== 'undefined' ? window?.location?.pathname : '',
+	),
 };

 const globalTimeReducer = (
View File

@@ -21,7 +21,7 @@ const InitialValue: InitialValueTypes = {
 	services: [],
 	dbOverView: [],
 	externalService: [],
-	topEndPoints: [],
+	topOperations: [],
 	externalAverageDuration: [],
 	externalError: [],
 	serviceOverview: [],
@@ -29,6 +29,7 @@ const InitialValue: InitialValueTypes = {
 	resourceAttributePromQLQuery: resourceAttributesQueryToPromQL(
 		GetResourceAttributeQueriesFromURL() || [],
 	),
+	topLevelOperations: [],
 };

 const metrics = (
@@ -88,22 +89,24 @@ const metrics = (
 		case GET_INTIAL_APPLICATION_DATA: {
 			const {
 				// dbOverView,
-				topEndPoints,
+				topOperations,
 				serviceOverview,
 				// externalService,
 				// externalAverageDuration,
 				// externalError,
+				topLevelOperations,
 			} = action.payload;

 			return {
 				...state,
 				// dbOverView,
-				topEndPoints,
+				topOperations,
 				serviceOverview,
 				// externalService,
 				// externalAverageDuration,
 				// externalError,
 				metricsApplicationLoading: false,
+				topLevelOperations,
 			};
 		}

View File

@@ -2,7 +2,6 @@ import { Action, ActionTypes, ServiceMapStore } from 'store/actions';

 const initialState: ServiceMapStore = {
 	items: [],
-	services: [],
 	loading: true,
 };

@@ -16,11 +15,6 @@ export const ServiceMapReducer = (
 				...state,
 				items: action.payload,
 			};
-		case ActionTypes.getServices:
-			return {
-				...state,
-				services: action.payload,
-			};
 		case ActionTypes.serviceMapLoading: {
 			return {
 				...state,

View File

@@ -5,7 +5,7 @@
 import { IResourceAttributeQuery } from 'container/MetricsApplication/ResourceAttributesFilter/types';
 import { ServicesList } from 'types/api/metrics/getService';
 import { ServiceOverview } from 'types/api/metrics/getServiceOverview';
-import { TopEndPoints } from 'types/api/metrics/getTopEndPoints';
+import { TopOperations } from 'types/api/metrics/getTopOperations';

 export const GET_SERVICE_LIST_SUCCESS = 'GET_SERVICE_LIST_SUCCESS';
 export const GET_SERVICE_LIST_LOADING_START = 'GET_SERVICE_LIST_LOADING_START';
@@ -38,12 +38,13 @@ export interface GetServiceListError {

 export interface GetInitialApplicationData {
 	type: typeof GET_INTIAL_APPLICATION_DATA;
 	payload: {
-		topEndPoints: TopEndPoints[];
+		topOperations: TopOperations[];
 		// dbOverView: DBOverView[];
 		// externalService: ExternalService[];
 		// externalAverageDuration: ExternalAverageDuration[];
 		// externalError: ExternalError[];
 		serviceOverview: ServiceOverview[];
+		topLevelOperations: string[];
 	};
 }

View File

@@ -18,6 +18,8 @@ export interface AlertDef {
 	annotations?: Labels;
 	evalWindow?: string;
 	source?: string;
+	disabled?: boolean;
+	preferredChannels?: string[];
 }

 export interface RuleCondition {

View File

@@ -1,7 +1,7 @@
-import { Alerts } from './getAll';
+import { AlertDef } from './def';

 export interface Props {
-	id: Alerts['id'];
+	id: AlertDef['id'];
 }

 export interface PayloadProps {

View File

@@ -4,6 +4,13 @@ export interface Props {
 	id: AlertDef['id'];
 }

+export interface GettableAlert extends AlertDef {
+	id: number;
+	alert: string;
+	state: string;
+	disabled: boolean;
+}
+
 export type PayloadProps = {
-	data: AlertDef;
+	data: GettableAlert;
 };

View File

@@ -1,32 +1,3 @@
-export interface Alerts {
-	labels: AlertsLabel;
-	annotations: {
-		description: string;
-		summary: string;
-		[key: string]: string;
-	};
-	state: string;
-	name: string;
-	id: number;
-	endsAt: string;
-	fingerprint: string;
-	generatorURL: string;
-	receivers: Receivers[];
-	startsAt: string;
-	status: {
-		inhibitedBy: [];
-		silencedBy: [];
-		state: string;
-	};
-	updatedAt: string;
-}
-
-interface Receivers {
-	name: string;
-}
-
-interface AlertsLabel {
-	[key: string]: string;
-}
-
-export type PayloadProps = Alerts[];
+import { GettableAlert } from './get';
+
+export type PayloadProps = GettableAlert[];

View File

@@ -1,4 +1,4 @@
-import { Alerts } from './getAll';
+import { AlertDef } from './def';

 export interface Props {
 	silenced: boolean;
@@ -7,8 +7,8 @@ export interface Props {
 	[key: string]: string | boolean;
 }

 export interface Group {
-	alerts: Alerts[];
-	label: Alerts['labels'];
+	alerts: AlertDef[];
+	label: AlertDef['labels'];
 	receiver: {
 		[key: string]: string;
 	};

View File

@@ -1,4 +1,33 @@
-import { Alerts } from './getAll';
+export interface Alerts {
+	labels: AlertsLabel;
+	annotations: {
+		description: string;
+		summary: string;
+		[key: string]: string;
+	};
+	state: string;
+	name: string;
+	id: number;
+	endsAt: string;
+	fingerprint: string;
+	generatorURL: string;
+	receivers: Receivers[];
+	startsAt: string;
+	status: {
+		inhibitedBy: [];
+		silencedBy: [];
+		state: string;
+	};
+	updatedAt: string;
+}
+
+interface Receivers {
+	name: string;
+}
+
+interface AlertsLabel {
+	[key: string]: string;
+}

 export interface Props {
 	silenced: boolean;

View File

@@ -0,0 +1,12 @@
import { GettableAlert } from './get';
export type PayloadProps = GettableAlert;
export interface PatchProps {
disabled?: boolean;
}
export interface Props {
id?: number;
data: PatchProps;
}
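The patch types above pair with the new `disabled` flag added to `AlertDef` earlier in this diff. A small sketch of assembling a PATCH body that disables one alert — the builder function is illustrative, not part of the committed code:

```typescript
interface PatchProps {
  disabled?: boolean;
}

interface Props {
  id?: number;
  data: PatchProps;
}

// Illustrative helper: build the request shape for disabling alert `id`
const buildDisablePatch = (id: number): Props => ({
  id,
  data: { disabled: true },
});

const patch = buildDisablePatch(42);
// patch.data would be serialized as the JSON body of the PATCH request
```

Keeping `data` a partial (`disabled?`) means the same shape can later carry other toggles without changing every call site.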

View File

@@ -0,0 +1,10 @@
import { AlertDef } from 'types/api/alerts/def';
export interface Props {
data: AlertDef;
}
export interface PayloadProps {
alertCount: number;
message: string;
}

View File

@@ -0,0 +1,7 @@
export type TopLevelOperations = string[];
export interface Props {
service: string;
}
export type PayloadProps = TopLevelOperations;

View File

@@ -1,6 +1,6 @@
 import { Tags } from 'types/reducer/trace';

-export interface TopEndPoints {
+export interface TopOperations {
 	name: string;
 	numCalls: number;
 	p50: number;
@@ -15,4 +15,4 @@ export interface Props {
 	selectedTags: Tags[];
 }

-export type PayloadProps = TopEndPoints[];
+export type PayloadProps = TopOperations[];

View File

@@ -5,7 +5,7 @@ import { ExternalError } from 'types/api/metrics/getExternalError';
 import { ExternalService } from 'types/api/metrics/getExternalService';
 import { ServicesList } from 'types/api/metrics/getService';
 import { ServiceOverview } from 'types/api/metrics/getServiceOverview';
-import { TopEndPoints } from 'types/api/metrics/getTopEndPoints';
+import { TopOperations } from 'types/api/metrics/getTopOperations';

 interface MetricReducer {
 	services: ServicesList[];
@@ -15,12 +15,13 @@ interface MetricReducer {
 	errorMessage: string;
 	dbOverView: DBOverView[];
 	externalService: ExternalService[];
-	topEndPoints: TopEndPoints[];
+	topOperations: TopOperations[];
 	externalAverageDuration: ExternalAverageDuration[];
 	externalError: ExternalError[];
 	serviceOverview: ServiceOverview[];
 	resourceAttributeQueries: IResourceAttributeQuery[];
 	resourceAttributePromQLQuery: string;
+	topLevelOperations: string[];
 }

 export default MetricReducer;

View File

@@ -0,0 +1,101 @@
import { expect, Page, test } from '@playwright/test';
import ROUTES from 'constants/routes';
import allErrorList from '../fixtures/api/allErrors/200.json';
import errorDetailSuccess from '../fixtures/api/errorDetails/200.json';
import errorDetailNotFound from '../fixtures/api/errorDetails/404.json';
import nextPreviousSuccess from '../fixtures/api/getNextPrev/200.json';
import { loginApi } from '../fixtures/common';
import { JsonApplicationType } from '../fixtures/constant';
let page: Page;
const timestamp = '1657794588955274000';
test.describe('Expections Details', async () => {
test.beforeEach(async ({ baseURL, browser }) => {
const context = await browser.newContext({ storageState: 'tests/auth.json' });
const newPage = await context.newPage();
await loginApi(newPage);
await newPage.goto(`${baseURL}${ROUTES.APPLICATION}`);
page = newPage;
});
test('Should have not found when api return 404', async () => {
await Promise.all([
page.route('**/errorFromGroupID**', (route) =>
route.fulfill({
status: 404,
contentType: JsonApplicationType,
body: JSON.stringify(errorDetailNotFound),
}),
),
page.route('**/nextPrevErrorIDs**', (route) =>
route.fulfill({
status: 404,
contentType: JsonApplicationType,
body: JSON.stringify([]),
}),
),
]);
await page.goto(
`${ROUTES.ERROR_DETAIL}?groupId=${allErrorList[0].groupID}&timestamp=${timestamp}`,
{
waitUntil: 'networkidle',
},
);
const NoDataLocator = page.locator('text=Not Found');
const isVisible = await NoDataLocator.isVisible();
const text = await NoDataLocator.textContent();
expect(isVisible).toBe(true);
expect(text).toBe('Not Found');
expect(await page.screenshot()).toMatchSnapshot();
});
test('Render Success Data when 200 from details page', async () => {
await Promise.all([
page.route('**/errorFromGroupID**', (route) =>
route.fulfill({
status: 200,
contentType: JsonApplicationType,
body: JSON.stringify(errorDetailSuccess),
}),
),
page.route('**/nextPrevErrorIDs**', (route) =>
route.fulfill({
status: 200,
contentType: JsonApplicationType,
body: JSON.stringify(nextPreviousSuccess),
}),
),
]);
await page.goto(
`${ROUTES.ERROR_DETAIL}?groupId=${allErrorList[0].groupID}&timestamp=${timestamp}`,
{
waitUntil: 'networkidle',
},
);
const traceDetailButton = page.locator('text=See the error in trace graph');
const olderButton = page.locator('text=Older');
const newerButton = page.locator(`text=Newer`);
expect(await traceDetailButton.isVisible()).toBe(true);
expect(await olderButton.isVisible()).toBe(true);
expect(await newerButton.isVisible()).toBe(true);
expect(await traceDetailButton.textContent()).toBe(
'See the error in trace graph',
);
expect(await olderButton.textContent()).toBe('Older');
expect(await newerButton.textContent()).toBe('Newer');
expect(await page.screenshot()).toMatchSnapshot();
});
});

View File

@@ -0,0 +1,148 @@
import { expect, Page, test } from '@playwright/test';
import ROUTES from 'constants/routes';
import successAllErrors from '../fixtures/api/allErrors/200.json';
import { loginApi } from '../fixtures/common';
import { JsonApplicationType } from '../fixtures/constant';
const noDataTableData = async (page: Page): Promise<void> => {
const text = page.locator('text=No Data');
expect(text).toBeVisible();
expect(text).toHaveText('No Data');
const textType = [
'Exception Type',
'Error Message',
'Last Seen',
'First Seen',
'Application',
];
textType.forEach(async (text) => {
const textLocator = page.locator(`text=${text}`);
const textContent = await textLocator.textContent();
expect(textContent).toBe(text);
expect(textLocator).not.toBeNull();
expect(textLocator).toBeVisible();
await expect(textLocator).toHaveText(`${text}`);
});
};
let page: Page;
test.describe('Expections page', async () => {
test.beforeEach(async ({ baseURL, browser }) => {
const context = await browser.newContext({ storageState: 'tests/auth.json' });
const newPage = await context.newPage();
await loginApi(newPage);
await newPage.goto(`${baseURL}${ROUTES.APPLICATION}`);
page = newPage;
});
test('Should have a valid route', async () => {
await page.goto(ROUTES.ALL_ERROR);
await expect(page).toHaveURL(ROUTES.ALL_ERROR);
expect(await page.screenshot()).toMatchSnapshot();
});
test('Should have a valid Breadcrumbs', async () => {
await page.goto(ROUTES.ALL_ERROR, {
waitUntil: 'networkidle',
});
const expectionsLocator = page.locator('a:has-text("Exceptions")');
await expect(expectionsLocator).toBeVisible();
await expect(expectionsLocator).toHaveText('Exceptions');
await expect(expectionsLocator).toHaveAttribute('href', ROUTES.ALL_ERROR);
expect(await page.screenshot()).toMatchSnapshot();
});
test('Should render the page with 404 status', async () => {
await page.route('**listErrors', (route) =>
route.fulfill({
status: 404,
contentType: JsonApplicationType,
body: JSON.stringify([]),
}),
);
await page.goto(ROUTES.ALL_ERROR, {
waitUntil: 'networkidle',
});
await noDataTableData(page);
expect(await page.screenshot()).toMatchSnapshot();
});
test('Should render the page with 500 status in antd notification with no data antd table', async () => {
await page.route(`**/listErrors**`, (route) =>
route.fulfill({
status: 500,
contentType: JsonApplicationType,
body: JSON.stringify([]),
}),
);
await page.goto(ROUTES.ALL_ERROR, {
waitUntil: 'networkidle',
});
const text = 'Something went wrong';
const el = page.locator(`text=${text}`);
expect(el).toBeVisible();
expect(el).toHaveText(`${text}`);
expect(await el.getAttribute('disabled')).toBe(null);
await noDataTableData(page);
expect(await page.screenshot()).toMatchSnapshot();
});
test('Should render data in antd table', async () => {
await Promise.all([
page.route(`**/listErrors**`, (route) =>
route.fulfill({
status: 200,
contentType: JsonApplicationType,
body: JSON.stringify(successAllErrors),
}),
),
page.route('**/countErrors**', (route) =>
route.fulfill({
status: 200,
contentType: JsonApplicationType,
body: JSON.stringify(200),
}),
),
]);
await page.goto(ROUTES.ALL_ERROR, {
waitUntil: 'networkidle',
});
await page.evaluate(() => window.scrollTo(0, document.body.scrollHeight));
const expectionType = page.locator(
`td:has-text("${successAllErrors[1].exceptionType}")`,
);
expect(expectionType).toBeVisible();
const second = page.locator('li > a:has-text("2") >> nth=0');
const isVisisble = await second.isVisible();
expect(isVisisble).toBe(true);
expect(await page.screenshot()).toMatchSnapshot();
});
});
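The specs above repeat the same `page.route` → `route.fulfill` boilerplate for every mocked endpoint. A small helper (illustrative, not part of the committed code) would keep each mock to one line; structural stand-in types are used here so the sketch is self-contained, where real specs would import `Page` from `@playwright/test`:

```typescript
// Structural stand-ins for Playwright's Route/Page interfaces
interface FulfillOptions {
  status: number;
  contentType: string;
  body: string;
}

interface RouteLike {
  fulfill: (opts: FulfillOptions) => void;
}

interface PageLike {
  route: (pattern: string, handler: (route: RouteLike) => void) => void;
}

// One-line JSON mock, replacing the repeated route.fulfill boilerplate
const fulfillJson = (
  page: PageLike,
  pattern: string,
  status: number,
  body: unknown,
): void =>
  page.route(pattern, (route) =>
    route.fulfill({
      status,
      contentType: 'application/json',
      body: JSON.stringify(body),
    }),
  );

// Demonstration with an in-memory double of `page`
const calls: Array<{ pattern: string; opts: FulfillOptions }> = [];
const fakePage: PageLike = {
  route: (pattern, handler) =>
    handler({ fulfill: (opts) => calls.push({ pattern, opts }) }),
};
fulfillJson(fakePage, '**/listErrors**', 404, []);
```

Usage in a real spec would be `await fulfillJson(page, '**/listErrors**', 200, successAllErrors);` with the Playwright `Page` type substituted in.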

Binary file (screenshot snapshot) added — 42 KiB.

View File

@@ -0,0 +1,92 @@
[
{
"exceptionType": "ConnectionError",
"exceptionMessage": "HTTPSConnectionPool(host='run.mocekdy.io', port=443): Max retries exceeded with url: /v3/1cwb67153-a6ac-4aae-aca6-273ed68b5d9e (Caused by NewConnectionError('\u003curllib3.connection.HTTPSConnection object at 0x108ce9c10\u003e: Failed to establish a new connection: [Errno 8] nodename nor servname provided, or not known'))",
"exceptionCount": 2,
"lastSeen": "2022-07-14T10:29:48.955274Z",
"firstSeen": "2022-07-14T10:29:48.950721Z",
"serviceName": "1rfflaskAp",
"groupID": "e24d35bda98c5499a5c8df3ba61b0238"
},
{
"exceptionType": "NameError",
"exceptionMessage": "name 'listf' is not defined",
"exceptionCount": 8,
"lastSeen": "2022-07-14T10:30:42.411035Z",
"firstSeen": "2022-07-14T10:29:45.426784Z",
"serviceName": "1rfflaskAp",
"groupID": "efc46adcd5e87b65f8f244cba683b265"
},
{
"exceptionType": "ZeroDivisionError",
"exceptionMessage": "division by zero",
"exceptionCount": 1,
"lastSeen": "2022-07-14T10:29:54.195996Z",
"firstSeen": "2022-07-14T10:29:54.195996Z",
"serviceName": "1rfflaskAp",
"groupID": "a49058b540eef9aefe159d84f1a2b6df"
},
{
"exceptionType": "MaxRetryError",
"exceptionMessage": "HTTPSConnectionPool(host='rufn.fmoceky.io', port=443): Max retries exceeded with url: /v3/b851a5c6-ab54-495a-be04-69834ae0d2a7 (Caused by NewConnectionError('\u003curllib3.connection.HTTPSConnection object at 0x108ec2640\u003e: Failed to establish a new connection: [Errno 8] nodename nor servname provided, or not known'))",
"exceptionCount": 1,
"lastSeen": "2022-07-14T10:29:49.471402Z",
"firstSeen": "2022-07-14T10:29:49.471402Z",
"serviceName": "1rfflaskAp",
"groupID": "e59d39239f4d48842d83e3cc4cf53249"
},
{
"exceptionType": "MaxRetryError",
"exceptionMessage": "HTTPSConnectionPool(host='run.mocekdy.io', port=443): Max retries exceeded with url: /v3/1cwb67153-a6ac-4aae-aca6-273ed68b5d9e (Caused by NewConnectionError('\u003curllib3.connection.HTTPSConnection object at 0x108ce9c10\u003e: Failed to establish a new connection: [Errno 8] nodename nor servname provided, or not known'))",
"exceptionCount": 1,
"lastSeen": "2022-07-14T10:29:48.947579Z",
"firstSeen": "2022-07-14T10:29:48.947579Z",
"serviceName": "1rfflaskAp",
"groupID": "14d18a6fb1cd3f541de1566530e75486"
},
{
"exceptionType": "ConnectionError",
"exceptionMessage": "HTTPSConnectionPool(host='rufn.fmoceky.io', port=443): Max retries exceeded with url: /v3/b851a5c6-ab54-495a-be04-69834ae0d2a7 (Caused by NewConnectionError('\u003curllib3.connection.HTTPSConnection object at 0x108ec2640\u003e: Failed to establish a new connection: [Errno 8] nodename nor servname provided, or not known'))",
"exceptionCount": 2,
"lastSeen": "2022-07-14T10:29:49.476718Z",
"firstSeen": "2022-07-14T10:29:49.472271Z",
"serviceName": "1rfflaskAp",
"groupID": "bf6d88d10397ca3194b96a10f4719031"
},
{
"exceptionType": "github.com/gin-gonic/gin.Error",
"exceptionMessage": "Sample Error",
"exceptionCount": 6,
"lastSeen": "2022-07-15T18:55:32.3538096Z",
"firstSeen": "2022-07-14T14:47:19.874387Z",
"serviceName": "goApp",
"groupID": "b4fd099280072d45318e1523d82aa9c1"
},
{
"exceptionType": "MaxRetryError",
"exceptionMessage": "HTTPSConnectionPool(host='rufn.fmoceky.io', port=443): Max retries exceeded with url: /v3/b851a5c6-ab54-495a-be04-69834ae0d2a7 (Caused by NewConnectionError('\u003curllib3.connection.HTTPSConnection object at 0x10801b490\u003e: Failed to establish a new connection: [Errno 8] nodename nor servname provided, or not known'))",
"exceptionCount": 1,
"lastSeen": "2022-07-14T11:07:06.560593Z",
"firstSeen": "2022-07-14T11:07:06.560593Z",
"serviceName": "samplFlaskApp",
"groupID": "1945671c945b10641e73b0fe28c4d486"
},
{
"exceptionType": "ConnectionError",
"exceptionMessage": "HTTPSConnectionPool(host='rufn.fmoceky.io', port=443): Max retries exceeded with url: /v3/b851a5c6-ab54-495a-be04-69834ae0d2a7 (Caused by NewConnectionError('\u003curllib3.connection.HTTPSConnection object at 0x10801b490\u003e: Failed to establish a new connection: [Errno 8] nodename nor servname provided, or not known'))",
"exceptionCount": 2,
"lastSeen": "2022-07-14T11:07:06.56493Z",
"firstSeen": "2022-07-14T11:07:06.561074Z",
"serviceName": "samplFlaskApp",
"groupID": "5bea5295cac187404005f9c96e71aa53"
},
{
"exceptionType": "ConnectionError",
"exceptionMessage": "HTTPSConnectionPool(host='rufn.fmoceky.io', port=443): Max retries exceeded with url: /v3/b851a5c6-ab54-495a-be04-69834ae0d2a7 (Caused by NewConnectionError('\u003curllib3.connection.HTTPSConnection object at 0x108031820\u003e: Failed to establish a new connection: [Errno 8] nodename nor servname provided, or not known'))",
"exceptionCount": 2,
"lastSeen": "2022-07-14T11:07:06.363977Z",
"firstSeen": "2022-07-14T11:07:06.361163Z",
"serviceName": "samplFlaskApp",
"groupID": "52a1fbe033453d806c0f24ba39168a78"
}
]

File diff suppressed because one or more lines are too long

View File

@@ -0,0 +1,5 @@
{
"error": "Error/Exception not found",
"errorType": "not_found",
"status": "error"
}

View File

@@ -0,0 +1,7 @@
{
"nextErrorID": "",
"nextTimestamp": "0001-01-01T00:00:00Z",
"prevErrorID": "217133e5f7df429abd31b507859ea513",
"prevTimestamp": "2022-07-14T10:29:48.950721Z",
"groupID": "e24d35bda98c5499a5c8df3ba61b0238"
}
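In the fixture above, an empty `nextErrorID` together with the zero timestamp `0001-01-01T00:00:00Z` (Go's zero `time.Time`) appears to signal that no newer error exists, while a populated `prevErrorID` means an older one does. A hedged sketch of that check (the interpretation is an assumption drawn from the fixture, not a documented contract):

```typescript
interface NextPrevPayload {
  nextErrorID: string;
  nextTimestamp: string;
  prevErrorID: string;
  prevTimestamp: string;
  groupID: string;
}

// Go serializes a zero time.Time as this RFC 3339 string
const ZERO_TIME = '0001-01-01T00:00:00Z';

const hasNewer = (p: NextPrevPayload): boolean =>
  p.nextErrorID !== '' && p.nextTimestamp !== ZERO_TIME;

const hasOlder = (p: NextPrevPayload): boolean =>
  p.prevErrorID !== '' && p.prevTimestamp !== ZERO_TIME;

// The fixture above: only an older error is available
const fixture: NextPrevPayload = {
  nextErrorID: '',
  nextTimestamp: '0001-01-01T00:00:00Z',
  prevErrorID: '217133e5f7df429abd31b507859ea513',
  prevTimestamp: '2022-07-14T10:29:48.950721Z',
  groupID: 'e24d35bda98c5499a5c8df3ba61b0238',
};
```

A UI consuming this payload could disable the Newer/Older navigation buttons based on these two predicates.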

View File

@@ -6,3 +6,5 @@ export const validPassword = 'SamplePassword98@@';

 export const getStartedButtonSelector = 'button[data-attr="signup"]';
 export const confirmPasswordSelector = '#password-confirm-error';
+
+export const JsonApplicationType = 'application/json';

View File

@@ -24,5 +24,6 @@ test.describe('Version API fail while loading login page', async () => {
 		expect(el).toBeVisible();
 		expect(el).toHaveText(`${text}`);
 		expect(await el.getAttribute('disabled')).toBe(null);
+		expect(await page.screenshot()).toMatchSnapshot();
 	});
 });

View File

@@ -45,5 +45,6 @@ test.describe('Login Page', () => {
 		element.isVisible();
 		const text = await element.innerText();
 		expect(text).toBe(`SigNoz ${version}`);
+		expect(await page.screenshot()).toMatchSnapshot();
 	});
 });

View File

@@ -16,7 +16,17 @@ test.describe('Service Page', () => {
 		page = newPage;
 	});

 	test('Serice Page is rendered', async ({ baseURL }) => {
 		await expect(page).toHaveURL(`${baseURL}${ROUTES.APPLICATION}`);
+		expect(await page.screenshot()).toMatchSnapshot();
+	});
+
+	test('Logged In must be true', async () => {
+		const { app } = await page.evaluate(() => window.store.getState());
+		const { isLoggedIn } = app;
+
+		expect(isLoggedIn).toBe(true);
 	});
 });

Binary file (screenshot snapshot) added — 40 KiB.

View File

@@ -77,6 +77,7 @@ test.describe('Sign Up Page', () => {
 		await buttonSignupButton.click();

 		expect(page).toHaveURL(`${baseURL}${ROUTES.SIGN_UP}`);
+		expect(await page.screenshot()).toMatchSnapshot();
 	});

 	test('Invite link validation', async ({ baseURL, page }) => {
@@ -87,6 +88,7 @@ test.describe('Sign Up Page', () => {
 		const messageText = await page.locator(`text=${message}`).innerText();

 		expect(messageText).toBe(message);
+		expect(await page.screenshot()).toMatchSnapshot();
 	});

 	test('User Sign up with valid details', async ({ baseURL, page, context }) => {
@@ -125,6 +127,7 @@ test.describe('Sign Up Page', () => {
 		await context.storageState({
 			path: 'tests/auth.json',
 		});
+		expect(await page.screenshot()).toMatchSnapshot();
 	});

 	test('Empty name with valid details', async ({ baseURL, page }) => {
@@ -142,6 +145,7 @@ test.describe('Sign Up Page', () => {
 		const gettingStartedButton = page.locator(getStartedButtonSelector);

 		expect(await gettingStartedButton.isDisabled()).toBe(true);
+		expect(await page.screenshot()).toMatchSnapshot();
 	});

 	test('Empty Company name with valid details', async ({ baseURL, page }) => {
@@ -159,6 +163,7 @@ test.describe('Sign Up Page', () => {
 		const gettingStartedButton = page.locator(getStartedButtonSelector);

 		expect(await gettingStartedButton.isDisabled()).toBe(true);
+		expect(await page.screenshot()).toMatchSnapshot();
 	});

 	test('Empty Email with valid details', async ({ baseURL, page }) => {
@@ -176,6 +181,7 @@ test.describe('Sign Up Page', () => {
 		const gettingStartedButton = page.locator(getStartedButtonSelector);

 		expect(await gettingStartedButton.isDisabled()).toBe(true);
+		expect(await page.screenshot()).toMatchSnapshot();
 	});

 	test('Empty Password and confirm password with valid details', async ({
@@ -200,6 +206,7 @@ test.describe('Sign Up Page', () => {
 		// password validation message is not present
 		const locator = await page.locator(confirmPasswordSelector).isVisible();
 		expect(locator).toBe(false);
+		expect(await page.screenshot()).toMatchSnapshot();
 	});

 	test('Miss Match Password and confirm password with valid details', async ({
@@ -220,5 +227,6 @@ test.describe('Sign Up Page', () => {
 		// password validation message is not present
 		const locator = await page.locator(confirmPasswordSelector).isVisible();
 		expect(locator).toBe(true);
+		expect(await page.screenshot()).toMatchSnapshot();
 	});
 });

Binary files (screenshot snapshots) added — 70 KiB, 70 KiB, 66 KiB, 54 KiB.

View File

@ -20,7 +20,7 @@ RUN go mod download -x
# Add the sources and proceed with build
ADD . .
RUN go build -tags timetzdata -a -ldflags "-linkmode external -extldflags '-static' -s -w $LD_FLAGS" -o ./bin/query-service ./main.go
RUN chmod +x ./bin/query-service


@ -6,8 +6,37 @@ Query service is the interface between frontend and databases. It is written in
- parse response from databases and handle error if any
- clickhouse response in the format accepted by Frontend
#### Complete the ClickHouse setup locally
https://github.com/SigNoz/signoz/blob/main/CONTRIBUTING.md#to-run-clickhouse-setup-recommended-for-local-development
- Comment out the query-service and the frontend section in `signoz/deploy/docker/clickhouse-setup/docker-compose.yaml`
- Change the alertmanager section in `signoz/deploy/docker/clickhouse-setup/docker-compose.yaml` as follows:
```console
alertmanager:
image: signoz/alertmanager:0.23.0-0.1
volumes:
- ./data/alertmanager:/data
expose:
- "9093"
ports:
- "8080:9093"
# depends_on:
# query-service:
# condition: service_healthy
restart: on-failure
command:
- --queryService.url=http://172.17.0.1:8085
- --storage.path=/data
```
- Run the following:
```console
cd signoz/
If you are using x86_64 processors (All Intel/AMD processors) run sudo make run-x86
If you are on arm64 processors (Apple M1 Macs) run sudo make run-arm
```
#### Backend Configuration
- Open ./constants/constants.go
- Replace ```const RELATIONAL_DATASOURCE_PATH = "/var/lib/signoz/signoz.db"``` \
  with ```const RELATIONAL_DATASOURCE_PATH = "./signoz.db"```
@ -15,8 +44,9 @@ Query service is the interface between frontend and databases. It is written in
- Query Service needs below `env` variables to run:
```
export ClickHouseUrl=tcp://localhost:9001
export STORAGE=clickhouse
export ALERTMANAGER_API_PREFIX=http://localhost:9093/api/
```
<!-- The above values are the default ones used by SigNoz and are kept at `deploy/kubernetes/platform/signoz-charts/query-service/values.yaml` -->
@ -28,5 +58,24 @@ go build -o build/query-service main.go
ClickHouseUrl=tcp://localhost:9001 STORAGE=clickhouse build/query-service
```
#### Frontend Configuration for local query-service
- Set the following environment variables
```console
export FRONTEND_API_ENDPOINT=http://localhost:8080
```
- Run the following
```console
cd signoz/frontend/
yarn install
yarn dev
```
#### Note
If you use Go version 1.18 for development and contributions, please check out the following issue:
https://github.com/SigNoz/signoz/issues/1371
#### Docker Images
The docker image of query-service is available at https://hub.docker.com/r/signoz/query-service


@ -18,16 +18,19 @@ const (
)

const (
	defaultDatasource              string        = "tcp://localhost:9000"
	defaultTraceDB                 string        = "signoz_traces"
	defaultOperationsTable         string        = "signoz_operations"
	defaultIndexTable              string        = "signoz_index_v2"
	defaultErrorTable              string        = "signoz_error_index_v2"
	defaultDurationTable           string        = "durationSortMV"
	defaultUsageExplorerTable      string        = "usage_explorer"
	defaultSpansTable              string        = "signoz_spans"
	defaultDependencyGraphTable    string        = "dependency_graph_minutes"
	defaultTopLevelOperationsTable string        = "top_level_operations"
	defaultWriteBatchDelay         time.Duration = 5 * time.Second
	defaultWriteBatchSize          int           = 10000
	defaultEncoding                Encoding      = EncodingJSON
)

const (
@ -43,19 +46,22 @@ const (
// NamespaceConfig is Clickhouse's internal configuration data
type namespaceConfig struct {
	namespace               string
	Enabled                 bool
	Datasource              string
	TraceDB                 string
	OperationsTable         string
	IndexTable              string
	DurationTable           string
	UsageExplorerTable      string
	SpansTable              string
	ErrorTable              string
	DependencyGraphTable    string
	TopLevelOperationsTable string
	WriteBatchDelay         time.Duration
	WriteBatchSize          int
	Encoding                Encoding
	Connector               Connector
}
// Connector defines how to connect to the database
@ -102,19 +108,22 @@ func NewOptions(datasource string, primaryNamespace string, otherNamespaces ...s
	options := &Options{
		primary: &namespaceConfig{
			namespace:               primaryNamespace,
			Enabled:                 true,
			Datasource:              datasource,
			TraceDB:                 defaultTraceDB,
			OperationsTable:         defaultOperationsTable,
			IndexTable:              defaultIndexTable,
			ErrorTable:              defaultErrorTable,
			DurationTable:           defaultDurationTable,
			UsageExplorerTable:      defaultUsageExplorerTable,
			SpansTable:              defaultSpansTable,
			DependencyGraphTable:    defaultDependencyGraphTable,
			TopLevelOperationsTable: defaultTopLevelOperationsTable,
			WriteBatchDelay:         defaultWriteBatchDelay,
			WriteBatchSize:          defaultWriteBatchSize,
			Encoding:                defaultEncoding,
			Connector:               defaultConnector,
		},
		others: make(map[string]*namespaceConfig, len(otherNamespaces)),
	}


@ -47,16 +47,17 @@ import (
)

const (
	primaryNamespace         = "clickhouse"
	archiveNamespace         = "clickhouse-archive"
	signozTraceDBName        = "signoz_traces"
	signozDurationMVTable    = "durationSort"
	signozUsageExplorerTable = "usage_explorer"
	signozSpansTable         = "signoz_spans"
	signozErrorIndexTable    = "signoz_error_index_v2"
	signozTraceTableName     = "signoz_index_v2"
	signozMetricDBName       = "signoz_metrics"
	signozSampleTableName    = "samples_v2"
	signozTSTableName        = "time_series_v2"

	minTimespanForProgressiveSearch       = time.Hour
	minTimespanForProgressiveSearchMargin = time.Minute
@ -75,16 +76,19 @@ var (
// SpanWriter for reading spans from ClickHouse
type ClickHouseReader struct {
	db                      clickhouse.Conn
	localDB                 *sqlx.DB
	traceDB                 string
	operationsTable         string
	durationTable           string
	usageExplorerTable      string
	indexTable              string
	errorTable              string
	spansTable              string
	dependencyGraphTable    string
	topLevelOperationsTable string
	queryEngine             *promql.Engine
	remoteStorage           *remote.Storage

	promConfigFile string
	promConfig     *config.Config
@ -111,16 +115,19 @@ func NewReader(localDB *sqlx.DB, configFile string) *ClickHouseReader {
} }
	return &ClickHouseReader{
		db:                      db,
		localDB:                 localDB,
		traceDB:                 options.primary.TraceDB,
		alertManager:            alertManager,
		operationsTable:         options.primary.OperationsTable,
		indexTable:              options.primary.IndexTable,
		errorTable:              options.primary.ErrorTable,
		usageExplorerTable:      options.primary.UsageExplorerTable,
		durationTable:           options.primary.DurationTable,
		spansTable:              options.primary.SpansTable,
		dependencyGraphTable:    options.primary.DependencyGraphTable,
		topLevelOperationsTable: options.primary.TopLevelOperationsTable,
		promConfigFile:          configFile,
	}
}
@ -374,14 +381,21 @@ func (r *ClickHouseReader) GetChannel(id string) (*model.ChannelItem, *model.Api
	idInt, _ := strconv.Atoi(id)
	channel := model.ChannelItem{}

	query := "SELECT id, created_at, updated_at, name, type, data data FROM notification_channels WHERE id=? "

	stmt, err := r.localDB.Preparex(query)

	zap.S().Info(query, idInt)

	if err != nil {
		zap.S().Debug("Error in preparing sql query for GetChannel : ", err)
		return nil, &model.ApiError{Typ: model.ErrorInternal, Err: err}
	}

	err = stmt.Get(&channel, idInt)

	if err != nil {
		zap.S().Debug(fmt.Sprintf("Error in getting channel with id=%d : ", idInt), err)
		return nil, &model.ApiError{Typ: model.ErrorInternal, Err: err}
	}
@ -650,103 +664,153 @@ func (r *ClickHouseReader) GetServicesList(ctx context.Context) (*[]string, erro
return &services, nil return &services, nil
} }
func (r *ClickHouseReader) GetTopLevelOperations(ctx context.Context) (*map[string][]string, *model.ApiError) {
operations := map[string][]string{}
query := fmt.Sprintf(`SELECT DISTINCT name, serviceName FROM %s.%s`, r.traceDB, r.topLevelOperationsTable)
rows, err := r.db.Query(ctx, query)
if err != nil {
zap.S().Error("Error in processing sql query: ", err)
return nil, &model.ApiError{Typ: model.ErrorExec, Err: fmt.Errorf("Error in processing sql query")}
}
defer rows.Close()
for rows.Next() {
var name, serviceName string
if err := rows.Scan(&name, &serviceName); err != nil {
return nil, &model.ApiError{Typ: model.ErrorInternal, Err: fmt.Errorf("Error in reading data")}
}
if _, ok := operations[serviceName]; !ok {
operations[serviceName] = []string{}
}
operations[serviceName] = append(operations[serviceName], name)
}
return &operations, nil
}
func (r *ClickHouseReader) GetServices(ctx context.Context, queryParams *model.GetServicesParams) (*[]model.ServiceItem, *model.ApiError) { func (r *ClickHouseReader) GetServices(ctx context.Context, queryParams *model.GetServicesParams) (*[]model.ServiceItem, *model.ApiError) {
if r.indexTable == "" { if r.indexTable == "" {
return nil, &model.ApiError{Typ: model.ErrorExec, Err: ErrNoIndexTable} return nil, &model.ApiError{Typ: model.ErrorExec, Err: ErrNoIndexTable}
} }
topLevelOps, apiErr := r.GetTopLevelOperations(ctx)
if apiErr != nil {
return nil, apiErr
}
serviceItems := []model.ServiceItem{} serviceItems := []model.ServiceItem{}
var wg sync.WaitGroup
// limit the number of concurrent queries to not overload the clickhouse server
sem := make(chan struct{}, 10)
var mtx sync.RWMutex
query := fmt.Sprintf("SELECT serviceName, quantile(0.99)(durationNano) as p99, avg(durationNano) as avgDuration, count(*) as numCalls FROM %s.%s WHERE timestamp>='%s' AND timestamp<='%s' AND kind='2'", r.traceDB, r.indexTable, strconv.FormatInt(queryParams.Start.UnixNano(), 10), strconv.FormatInt(queryParams.End.UnixNano(), 10)) for svc, ops := range *topLevelOps {
args := []interface{}{} sem <- struct{}{}
args, errStatus := buildQueryWithTagParams(ctx, queryParams.Tags, &query, args) wg.Add(1)
if errStatus != nil { go func(svc string, ops []string) {
return nil, errStatus defer wg.Done()
defer func() { <-sem }()
var serviceItem model.ServiceItem
var numErrors uint64
query := fmt.Sprintf(
`SELECT
quantile(0.99)(durationNano) as p99,
avg(durationNano) as avgDuration,
count(*) as numCalls
FROM %s.%s
WHERE serviceName = @serviceName AND name In [@names] AND timestamp>= @start AND timestamp<= @end`,
r.traceDB, r.indexTable,
)
errorQuery := fmt.Sprintf(
`SELECT
count(*) as numErrors
FROM %s.%s
WHERE serviceName = @serviceName AND name In [@names] AND timestamp>= @start AND timestamp<= @end AND statusCode=2`,
r.traceDB, r.indexTable,
)
args := []interface{}{}
args = append(args,
clickhouse.Named("start", strconv.FormatInt(queryParams.Start.UnixNano(), 10)),
clickhouse.Named("end", strconv.FormatInt(queryParams.End.UnixNano(), 10)),
clickhouse.Named("serviceName", svc),
clickhouse.Named("names", ops),
)
args, errStatus := buildQueryWithTagParams(ctx, queryParams.Tags, &query, args)
if errStatus != nil {
zap.S().Error("Error in processing sql query: ", errStatus)
return
}
err := r.db.QueryRow(
ctx,
query,
args...,
).ScanStruct(&serviceItem)
if err != nil {
zap.S().Error("Error in processing sql query: ", err)
return
}
err = r.db.QueryRow(ctx, errorQuery, args...).Scan(&numErrors)
if err != nil {
zap.S().Error("Error in processing sql query: ", err)
return
}
serviceItem.ServiceName = svc
serviceItem.NumErrors = numErrors
mtx.Lock()
serviceItems = append(serviceItems, serviceItem)
mtx.Unlock()
}(svc, ops)
} }
query += " GROUP BY serviceName ORDER BY p99 DESC" wg.Wait()
err := r.db.Select(ctx, &serviceItems, query, args...)
zap.S().Info(query) for idx := range serviceItems {
serviceItems[idx].CallRate = float64(serviceItems[idx].NumCalls) / float64(queryParams.Period)
if err != nil { serviceItems[idx].ErrorRate = float64(serviceItems[idx].NumErrors) * 100 / float64(serviceItems[idx].NumCalls)
zap.S().Debug("Error in processing sql query: ", err)
return nil, &model.ApiError{Typ: model.ErrorExec, Err: fmt.Errorf("Error in processing sql query")}
} }
////////////////// Below block gets 5xx of services
serviceErrorItems := []model.ServiceItem{}
query = fmt.Sprintf("SELECT serviceName, count(*) as numErrors FROM %s.%s WHERE timestamp>='%s' AND timestamp<='%s' AND kind='2' AND (statusCode>=500 OR statusCode=2)", r.traceDB, r.indexTable, strconv.FormatInt(queryParams.Start.UnixNano(), 10), strconv.FormatInt(queryParams.End.UnixNano(), 10))
args = []interface{}{}
args, errStatus = buildQueryWithTagParams(ctx, queryParams.Tags, &query, args)
if errStatus != nil {
return nil, errStatus
}
query += " GROUP BY serviceName"
err = r.db.Select(ctx, &serviceErrorItems, query, args...)
zap.S().Info(query)
if err != nil {
zap.S().Debug("Error in processing sql query: ", err)
return nil, &model.ApiError{Typ: model.ErrorExec, Err: fmt.Errorf("Error in processing sql query")}
}
m5xx := make(map[string]uint64)
for j := range serviceErrorItems {
m5xx[serviceErrorItems[j].ServiceName] = serviceErrorItems[j].NumErrors
}
///////////////////////////////////////////
////////////////// Below block gets 4xx of services
service4xxItems := []model.ServiceItem{}
query = fmt.Sprintf("SELECT serviceName, count(*) as num4xx FROM %s.%s WHERE timestamp>='%s' AND timestamp<='%s' AND kind='2' AND statusCode>=400 AND statusCode<500", r.traceDB, r.indexTable, strconv.FormatInt(queryParams.Start.UnixNano(), 10), strconv.FormatInt(queryParams.End.UnixNano(), 10))
args = []interface{}{}
args, errStatus = buildQueryWithTagParams(ctx, queryParams.Tags, &query, args)
if errStatus != nil {
return nil, errStatus
}
query += " GROUP BY serviceName"
err = r.db.Select(ctx, &service4xxItems, query, args...)
zap.S().Info(query)
if err != nil {
zap.S().Debug("Error in processing sql query: ", err)
return nil, &model.ApiError{Typ: model.ErrorExec, Err: fmt.Errorf("Error in processing sql query")}
}
m4xx := make(map[string]uint64)
for j := range service4xxItems {
m4xx[service4xxItems[j].ServiceName] = service4xxItems[j].Num4XX
}
for i := range serviceItems {
if val, ok := m5xx[serviceItems[i].ServiceName]; ok {
serviceItems[i].NumErrors = val
}
if val, ok := m4xx[serviceItems[i].ServiceName]; ok {
serviceItems[i].Num4XX = val
}
serviceItems[i].CallRate = float64(serviceItems[i].NumCalls) / float64(queryParams.Period)
serviceItems[i].FourXXRate = float64(serviceItems[i].Num4XX) * 100 / float64(serviceItems[i].NumCalls)
serviceItems[i].ErrorRate = float64(serviceItems[i].NumErrors) * 100 / float64(serviceItems[i].NumCalls)
}
return &serviceItems, nil return &serviceItems, nil
} }
func (r *ClickHouseReader) GetServiceOverview(ctx context.Context, queryParams *model.GetServiceOverviewParams) (*[]model.ServiceOverviewItem, *model.ApiError) { func (r *ClickHouseReader) GetServiceOverview(ctx context.Context, queryParams *model.GetServiceOverviewParams) (*[]model.ServiceOverviewItem, *model.ApiError) {
topLevelOps, apiErr := r.GetTopLevelOperations(ctx)
if apiErr != nil {
return nil, apiErr
}
ops, ok := (*topLevelOps)[queryParams.ServiceName]
if !ok {
return nil, &model.ApiError{Typ: model.ErrorNotFound, Err: fmt.Errorf("Service not found")}
}
namedArgs := []interface{}{
clickhouse.Named("interval", strconv.Itoa(int(queryParams.StepSeconds/60))),
clickhouse.Named("start", strconv.FormatInt(queryParams.Start.UnixNano(), 10)),
clickhouse.Named("end", strconv.FormatInt(queryParams.End.UnixNano(), 10)),
clickhouse.Named("serviceName", queryParams.ServiceName),
clickhouse.Named("names", ops),
}
serviceOverviewItems := []model.ServiceOverviewItem{} serviceOverviewItems := []model.ServiceOverviewItem{}
query := fmt.Sprintf("SELECT toStartOfInterval(timestamp, INTERVAL %s minute) as time, quantile(0.99)(durationNano) as p99, quantile(0.95)(durationNano) as p95,quantile(0.50)(durationNano) as p50, count(*) as numCalls FROM %s.%s WHERE timestamp>='%s' AND timestamp<='%s' AND kind='2' AND serviceName='%s'", strconv.Itoa(int(queryParams.StepSeconds/60)), r.traceDB, r.indexTable, strconv.FormatInt(queryParams.Start.UnixNano(), 10), strconv.FormatInt(queryParams.End.UnixNano(), 10), queryParams.ServiceName) query := fmt.Sprintf(`
SELECT
toStartOfInterval(timestamp, INTERVAL @interval minute) as time,
quantile(0.99)(durationNano) as p99,
quantile(0.95)(durationNano) as p95,
quantile(0.50)(durationNano) as p50,
count(*) as numCalls
FROM %s.%s
WHERE serviceName = @serviceName AND name In [@names] AND timestamp>= @start AND timestamp<= @end`,
r.traceDB, r.indexTable,
)
args := []interface{}{} args := []interface{}{}
args = append(args, namedArgs...)
args, errStatus := buildQueryWithTagParams(ctx, queryParams.Tags, &query, args) args, errStatus := buildQueryWithTagParams(ctx, queryParams.Tags, &query, args)
if errStatus != nil { if errStatus != nil {
return nil, errStatus return nil, errStatus
@ -754,17 +818,25 @@ func (r *ClickHouseReader) GetServiceOverview(ctx context.Context, queryParams *
query += " GROUP BY time ORDER BY time DESC" query += " GROUP BY time ORDER BY time DESC"
err := r.db.Select(ctx, &serviceOverviewItems, query, args...) err := r.db.Select(ctx, &serviceOverviewItems, query, args...)
zap.S().Info(query) zap.S().Debug(query)
if err != nil { if err != nil {
zap.S().Debug("Error in processing sql query: ", err) zap.S().Error("Error in processing sql query: ", err)
return nil, &model.ApiError{Typ: model.ErrorExec, Err: fmt.Errorf("Error in processing sql query")} return nil, &model.ApiError{Typ: model.ErrorExec, Err: fmt.Errorf("Error in processing sql query")}
} }
serviceErrorItems := []model.ServiceErrorItem{} serviceErrorItems := []model.ServiceErrorItem{}
query = fmt.Sprintf("SELECT toStartOfInterval(timestamp, INTERVAL %s minute) as time, count(*) as numErrors FROM %s.%s WHERE timestamp>='%s' AND timestamp<='%s' AND kind='2' AND serviceName='%s' AND hasError=true", strconv.Itoa(int(queryParams.StepSeconds/60)), r.traceDB, r.indexTable, strconv.FormatInt(queryParams.Start.UnixNano(), 10), strconv.FormatInt(queryParams.End.UnixNano(), 10), queryParams.ServiceName) query = fmt.Sprintf(`
SELECT
toStartOfInterval(timestamp, INTERVAL @interval minute) as time,
count(*) as numErrors
FROM %s.%s
WHERE serviceName = @serviceName AND name In [@names] AND timestamp>= @start AND timestamp<= @end AND statusCode=2`,
r.traceDB, r.indexTable,
)
args = []interface{}{} args = []interface{}{}
args = append(args, namedArgs...)
args, errStatus = buildQueryWithTagParams(ctx, queryParams.Tags, &query, args) args, errStatus = buildQueryWithTagParams(ctx, queryParams.Tags, &query, args)
if errStatus != nil { if errStatus != nil {
return nil, errStatus return nil, errStatus
@ -772,10 +844,10 @@ func (r *ClickHouseReader) GetServiceOverview(ctx context.Context, queryParams *
query += " GROUP BY time ORDER BY time DESC" query += " GROUP BY time ORDER BY time DESC"
err = r.db.Select(ctx, &serviceErrorItems, query, args...) err = r.db.Select(ctx, &serviceErrorItems, query, args...)
zap.S().Info(query) zap.S().Debug(query)
if err != nil { if err != nil {
zap.S().Debug("Error in processing sql query: ", err) zap.S().Error("Error in processing sql query: ", err)
return nil, &model.ApiError{Typ: model.ErrorExec, Err: fmt.Errorf("Error in processing sql query")} return nil, &model.ApiError{Typ: model.ErrorExec, Err: fmt.Errorf("Error in processing sql query")}
} }
@ -1516,45 +1588,67 @@ func (r *ClickHouseReader) GetTagValues(ctx context.Context, queryParams *model.
return &cleanedTagValues, nil return &cleanedTagValues, nil
} }
func (r *ClickHouseReader) GetTopEndpoints(ctx context.Context, queryParams *model.GetTopEndpointsParams) (*[]model.TopEndpointsItem, *model.ApiError) { func (r *ClickHouseReader) GetTopOperations(ctx context.Context, queryParams *model.GetTopOperationsParams) (*[]model.TopOperationsItem, *model.ApiError) {
var topEndpointsItems []model.TopEndpointsItem namedArgs := []interface{}{
clickhouse.Named("start", strconv.FormatInt(queryParams.Start.UnixNano(), 10)),
clickhouse.Named("end", strconv.FormatInt(queryParams.End.UnixNano(), 10)),
clickhouse.Named("serviceName", queryParams.ServiceName),
}
query := fmt.Sprintf("SELECT quantile(0.5)(durationNano) as p50, quantile(0.95)(durationNano) as p95, quantile(0.99)(durationNano) as p99, COUNT(1) as numCalls, name FROM %s.%s WHERE timestamp >= '%s' AND timestamp <= '%s' AND kind='2' and serviceName='%s'", r.traceDB, r.indexTable, strconv.FormatInt(queryParams.Start.UnixNano(), 10), strconv.FormatInt(queryParams.End.UnixNano(), 10), queryParams.ServiceName) var topOperationsItems []model.TopOperationsItem
query := fmt.Sprintf(`
SELECT
quantile(0.5)(durationNano) as p50,
quantile(0.95)(durationNano) as p95,
quantile(0.99)(durationNano) as p99,
COUNT(*) as numCalls,
name
FROM %s.%s
WHERE serviceName = @serviceName AND timestamp>= @start AND timestamp<= @end`,
r.traceDB, r.indexTable,
)
args := []interface{}{} args := []interface{}{}
args = append(args, namedArgs...)
args, errStatus := buildQueryWithTagParams(ctx, queryParams.Tags, &query, args) args, errStatus := buildQueryWithTagParams(ctx, queryParams.Tags, &query, args)
if errStatus != nil { if errStatus != nil {
return nil, errStatus return nil, errStatus
} }
query += " GROUP BY name" query += " GROUP BY name ORDER BY p99 DESC LIMIT 10"
err := r.db.Select(ctx, &topEndpointsItems, query, args...) err := r.db.Select(ctx, &topOperationsItems, query, args...)
zap.S().Info(query) zap.S().Debug(query)
if err != nil { if err != nil {
zap.S().Debug("Error in processing sql query: ", err) zap.S().Error("Error in processing sql query: ", err)
return nil, &model.ApiError{Typ: model.ErrorExec, Err: fmt.Errorf("Error in processing sql query")} return nil, &model.ApiError{Typ: model.ErrorExec, Err: fmt.Errorf("Error in processing sql query")}
} }
if topEndpointsItems == nil { if topOperationsItems == nil {
topEndpointsItems = []model.TopEndpointsItem{} topOperationsItems = []model.TopOperationsItem{}
} }
return &topEndpointsItems, nil return &topOperationsItems, nil
} }
func (r *ClickHouseReader) GetUsage(ctx context.Context, queryParams *model.GetUsageParams) (*[]model.UsageItem, error) { func (r *ClickHouseReader) GetUsage(ctx context.Context, queryParams *model.GetUsageParams) (*[]model.UsageItem, error) {
var usageItems []model.UsageItem var usageItems []model.UsageItem
namedArgs := []interface{}{
clickhouse.Named("interval", queryParams.StepHour),
clickhouse.Named("start", strconv.FormatInt(queryParams.Start.UnixNano(), 10)),
clickhouse.Named("end", strconv.FormatInt(queryParams.End.UnixNano(), 10)),
}
var query string var query string
if len(queryParams.ServiceName) != 0 { if len(queryParams.ServiceName) != 0 {
query = fmt.Sprintf("SELECT toStartOfInterval(timestamp, INTERVAL %d HOUR) as time, count(1) as count FROM %s.%s WHERE serviceName='%s' AND timestamp>='%s' AND timestamp<='%s' GROUP BY time ORDER BY time ASC", queryParams.StepHour, r.traceDB, r.indexTable, queryParams.ServiceName, strconv.FormatInt(queryParams.Start.UnixNano(), 10), strconv.FormatInt(queryParams.End.UnixNano(), 10)) namedArgs = append(namedArgs, clickhouse.Named("serviceName", queryParams.ServiceName))
query = fmt.Sprintf("SELECT toStartOfInterval(timestamp, INTERVAL @interval HOUR) as time, sum(count) as count FROM %s.%s WHERE service_name=@serviceName AND timestamp>=@start AND timestamp<=@end GROUP BY time ORDER BY time ASC", r.traceDB, r.usageExplorerTable)
} else { } else {
query = fmt.Sprintf("SELECT toStartOfInterval(timestamp, INTERVAL %d HOUR) as time, count(1) as count FROM %s.%s WHERE timestamp>='%s' AND timestamp<='%s' GROUP BY time ORDER BY time ASC", queryParams.StepHour, r.traceDB, r.indexTable, strconv.FormatInt(queryParams.Start.UnixNano(), 10), strconv.FormatInt(queryParams.End.UnixNano(), 10)) query = fmt.Sprintf("SELECT toStartOfInterval(timestamp, INTERVAL @interval HOUR) as time, sum(count) as count FROM %s.%s WHERE timestamp>=@start AND timestamp<=@end GROUP BY time ORDER BY time ASC", r.traceDB, r.usageExplorerTable)
} }
err := r.db.Select(ctx, &usageItems, query) err := r.db.Select(ctx, &usageItems, query, namedArgs...)
zap.S().Info(query) zap.S().Info(query)
@ -1614,48 +1708,50 @@ func interfaceArrayToStringArray(array []interface{}) []string {
return strArray return strArray
} }
func (r *ClickHouseReader) GetServiceMapDependencies(ctx context.Context, queryParams *model.GetServicesParams) (*[]model.ServiceMapDependencyResponseItem, error) { func (r *ClickHouseReader) GetDependencyGraph(ctx context.Context, queryParams *model.GetServicesParams) (*[]model.ServiceMapDependencyResponseItem, error) {
serviceMapDependencyItems := []model.ServiceMapDependencyItem{}
query := fmt.Sprintf(`SELECT spanID, parentSpanID, serviceName FROM %s.%s WHERE timestamp>='%s' AND timestamp<='%s'`, r.traceDB, r.indexTable, strconv.FormatInt(queryParams.Start.UnixNano(), 10), strconv.FormatInt(queryParams.End.UnixNano(), 10)) response := []model.ServiceMapDependencyResponseItem{}
err := r.db.Select(ctx, &serviceMapDependencyItems, query) args := []interface{}{}
args = append(args,
clickhouse.Named("start", uint64(queryParams.Start.Unix())),
clickhouse.Named("end", uint64(queryParams.End.Unix())),
clickhouse.Named("duration", uint64(queryParams.End.Unix()-queryParams.Start.Unix())),
)
zap.S().Info(query) query := fmt.Sprintf(`
WITH
quantilesMergeState(0.5, 0.75, 0.9, 0.95, 0.99)(duration_quantiles_state) AS duration_quantiles_state,
finalizeAggregation(duration_quantiles_state) AS result
SELECT
src as parent,
dest as child,
result[1] AS p50,
result[2] AS p75,
result[3] AS p90,
result[4] AS p95,
result[5] AS p99,
sum(total_count) as callCount,
sum(total_count)/ @duration AS callRate,
sum(error_count)/sum(total_count) as errorRate
FROM %s.%s
WHERE toUInt64(toDateTime(timestamp)) >= @start AND toUInt64(toDateTime(timestamp)) <= @end
GROUP BY
src,
dest`,
r.traceDB, r.dependencyGraphTable,
)
zap.S().Debug(query, args)
err := r.db.Select(ctx, &response, query, args...)
if err != nil { if err != nil {
zap.S().Debug("Error in processing sql query: ", err) zap.S().Error("Error in processing sql query: ", err)
return nil, fmt.Errorf("Error in processing sql query") return nil, fmt.Errorf("Error in processing sql query")
} }
serviceMap := make(map[string]*model.ServiceMapDependencyResponseItem) return &response, nil
spanId2ServiceNameMap := make(map[string]string)
for i := range serviceMapDependencyItems {
spanId2ServiceNameMap[serviceMapDependencyItems[i].SpanId] = serviceMapDependencyItems[i].ServiceName
}
for i := range serviceMapDependencyItems {
parent2childServiceName := spanId2ServiceNameMap[serviceMapDependencyItems[i].ParentSpanId] + "-" + spanId2ServiceNameMap[serviceMapDependencyItems[i].SpanId]
if _, ok := serviceMap[parent2childServiceName]; !ok {
serviceMap[parent2childServiceName] = &model.ServiceMapDependencyResponseItem{
Parent: spanId2ServiceNameMap[serviceMapDependencyItems[i].ParentSpanId],
Child: spanId2ServiceNameMap[serviceMapDependencyItems[i].SpanId],
CallCount: 1,
}
} else {
serviceMap[parent2childServiceName].CallCount++
}
}
retMe := make([]model.ServiceMapDependencyResponseItem, 0, len(serviceMap))
for _, dependency := range serviceMap {
if dependency.Parent == "" {
continue
}
retMe = append(retMe, *dependency)
}
return &retMe, nil
} }
func (r *ClickHouseReader) GetFilteredSpansAggregates(ctx context.Context, queryParams *model.GetFilteredSpanAggregatesParams) (*model.GetFilteredSpansAggregatesResponse, *model.ApiError) { func (r *ClickHouseReader) GetFilteredSpansAggregates(ctx context.Context, queryParams *model.GetFilteredSpanAggregatesParams) (*model.GetFilteredSpansAggregatesResponse, *model.ApiError) {
@ -1895,7 +1991,7 @@ func (r *ClickHouseReader) SetTTL(ctx context.Context,
switch params.Type { switch params.Type {
case constants.TraceTTL: case constants.TraceTTL:
tableNameArray := []string{signozTraceDBName + "." + signozTraceTableName, signozTraceDBName + "." + signozDurationMVTable, signozTraceDBName + "." + signozSpansTable, signozTraceDBName + "." + signozErrorIndexTable, signozTraceDBName + "." + signozUsageExplorerTable, signozTraceDBName + "." + defaultDependencyGraphTable}
for _, tableName = range tableNameArray { for _, tableName = range tableNameArray {
statusItem, err := r.checkTTLStatusItem(ctx, tableName) statusItem, err := r.checkTTLStatusItem(ctx, tableName)
if err != nil { if err != nil {
@ -2170,7 +2266,7 @@ func (r *ClickHouseReader) GetTTL(ctx context.Context, ttlParams *model.GetTTLPa
switch ttlParams.Type { switch ttlParams.Type {
case constants.TraceTTL: case constants.TraceTTL:
tableNameArray := []string{signozTraceDBName + "." + signozTraceTableName, signozTraceDBName + "." + signozDurationMVTable, signozTraceDBName + "." + signozSpansTable, signozTraceDBName + "." + signozErrorIndexTable, signozTraceDBName + "." + signozUsageExplorerTable, signozTraceDBName + "." + defaultDependencyGraphTable}
status, err := r.setTTLQueryStatus(ctx, tableNameArray) status, err := r.setTTLQueryStatus(ctx, tableNameArray)
if err != nil { if err != nil {
return nil, err return nil, err
