.github/CODEOWNERS (vendored)

@@ -4,4 +4,4 @@
 * @ankitnayan
 /frontend/ @palashgdev @pranshuchittora
 /deploy/ @prashant-shahi
-/pkg/query-service/ @srikanthccv @makeavish @nityanandagohain
+/pkg/query-service/ @srikanthccv

CONTRIBUTING.md

# Contributing Guidelines

## Welcome to SigNoz Contributing section 🎉

Hi there! We're thrilled that you'd like to contribute to this project; thank you for your interest. Whether it's a bug report, new feature, correction, or additional documentation, we greatly value feedback and contributions from our community.

Please read through this document before submitting any issues or pull requests to ensure we have all the necessary information to respond effectively to your bug report or contribution.

- We accept contributions made to the [SigNoz `develop` branch](https://github.com/SigNoz/signoz/tree/develop)
- Find all SigNoz Docker Hub images here
  - [signoz/frontend](https://hub.docker.com/r/signoz/frontend)
  - [signoz/query-service](https://hub.docker.com/r/signoz/query-service)
  - [signoz/otelcontribcol](https://hub.docker.com/r/signoz/otelcontribcol)

## Finding contributions to work on 💬

Looking at the existing issues is a great way to find something to contribute to. Also, have a look at these [good first issues](https://github.com/SigNoz/signoz/issues?q=is%3Aissue+is%3Aopen+label%3A%22good+first+issue%22) to start with.

## Sections:

- [General Instructions](#1-general-instructions-)
  - [For Creating Issue(s)](#11-for-creating-issues)
  - [For Pull Requests(s)](#12-for-pull-requests)
- [How to Contribute](#2-how-to-contribute-%EF%B8%8F)
- [Develop Frontend](#3-develop-frontend-)
  - [Contribute to Frontend with Docker installation of SigNoz](#31-contribute-to-frontend-with-docker-installation-of-signoz)
  - [Contribute to Frontend without installing SigNoz backend](#32-contribute-to-frontend-without-installing-signoz-backend)
- [Contribute to Backend (Query-Service)](#4-contribute-to-backend-query-service-)
  - [To run ClickHouse setup](#41-to-run-clickhouse-setup-recommended-for-local-development)
- [Contribute to SigNoz Helm Chart](#5-contribute-to-signoz-helm-chart-)
  - [To run helm chart for local development](#51-to-run-helm-chart-for-local-development)
- [Other Ways to Contribute](#other-ways-to-contribute)

# 1. General Instructions 📝

## 1.1 For Creating Issue(s)

Before making any significant changes and before filing a new issue, please check [existing open](https://github.com/SigNoz/signoz/issues?q=is%3Aopen+is%3Aissue) or [recently closed](https://github.com/SigNoz/signoz/issues?q=is%3Aissue+is%3Aclosed) issues to make sure somebody else hasn't already reported the issue. Please try to include as much information as you can.

**Issue Types** - [Bug Report](https://github.com/SigNoz/signoz/issues/new?assignees=&labels=&template=bug_report.md&title=) | [Feature Request](https://github.com/SigNoz/signoz/issues/new?assignees=&labels=&template=feature_request.md&title=) | [Performance Issue Report](https://github.com/SigNoz/signoz/issues/new?assignees=&labels=&template=performance-issue-report.md&title=) | [Report a Security Vulnerability](https://github.com/SigNoz/signoz/security/policy)

#### Details like these are incredibly useful:

- **Requirement** - what kind of use case are you trying to solve?
- **Proposal** - what do you suggest to solve the problem or improve the existing situation?
- Any open questions to address ❓

#### If you are reporting a bug, details like these are incredibly useful:

- A reproducible test case or series of steps.
- The version of our code being used.
- Any modifications you've made relevant to the bug 🐞.
- Anything unusual about your environment or deployment.

Discussing your proposed changes ahead of time will make the contribution process smooth for everyone 🙌.

**[`^top^`](#)**

<hr>

## 1.2 For Pull Request(s)

Contributions via pull requests are much appreciated. Once the approach is agreed upon ✅, make your changes and open a pull request.

Before sending us a pull request, please ensure that:

- You have forked the SigNoz repo on GitHub and cloned it to your machine.
- You have created a branch with your changes.
- You are working against the latest source on the `develop` branch.
- You modify only the source relevant to the specific change you are contributing.
- Local tests pass.
- You commit to your fork using clear commit messages.
- You send us a pull request, answering any default questions in the pull request interface.
- You pay attention to any automated CI failures reported in the pull request, and stay involved in the conversation.
- Once you've pushed your commits to GitHub, make sure that your branch can be auto-merged (there are no merge conflicts). If not, on your computer, merge main into your branch, resolve any merge conflicts, make sure everything still runs correctly and passes all the tests, and then push up those changes.
- Once the change has been approved and merged, we will inform you in a comment.

GitHub provides additional documentation on [forking a repository](https://help.github.com/articles/fork-a-repo/) and [creating a pull request](https://help.github.com/articles/creating-a-pull-request/).

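The checklist above can be sketched as a command flow. This is only an illustration: `your-username` and the branch name `fix/my-change` are placeholders, and the exact commands will vary with your change.

```
# Fork SigNoz on GitHub first, then:
git clone https://github.com/your-username/signoz.git
cd signoz
git checkout develop
git checkout -b fix/my-change       # branch with your changes
# ...edit, run local tests...
git commit -am "fix(FE): describe your change"
git push origin fix/my-change
# then open a pull request against the SigNoz `develop` branch
```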
**Note:** Unless your change is small, **please** consider submitting separate pull requests:

* 1️⃣ The first PR should include the overall structure of the new component:
  * README, configuration, interfaces or base classes, etc.
  * This PR is usually trivial to review, so the size limit does not apply to it.
* 2️⃣ The second PR should include the concrete implementation of the component. If the size of this PR is larger than the recommended size, consider **splitting** ⚔️ it into multiple PRs.
  * If there are multiple sub-components, then ideally each one should be implemented as a **separate** pull request.
* The last PR should include changes to **any user-facing documentation**, and should include end-to-end tests if applicable. The component must be enabled only after sufficient testing, and when there is enough confidence in the stability and quality of the component.

You can always reach out to `ankit@signoz.io` to understand more about the repo and product. We are very responsive over email and [Slack](https://signoz.io/slack).

### Pointers:
- If you find any **bugs** → please create an [**issue.**](https://github.com/SigNoz/signoz/issues/new?assignees=&labels=&template=bug_report.md&title=)
- If you find anything **missing** in documentation → you can create an issue with the label **`documentation`**.
- If you want to build any **new feature** → please create an [issue with the label **`enhancement`**.](https://github.com/SigNoz/signoz/issues/new?assignees=&labels=&template=feature_request.md&title=)
- If you want to **discuss** something about the product, start a new [**discussion**.](https://github.com/SigNoz/signoz/discussions)

<hr>

### Conventions to follow when submitting commits and pull requests

We try to follow [Conventional Commits](https://www.conventionalcommits.org/en/v1.0.0/); more specifically, the commits and PRs **should have type specifiers** prefixed in the name. [This](https://www.conventionalcommits.org/en/v1.0.0/#specification) should give you a better idea.

e.g. If you are submitting a fix for an issue in the frontend, the PR name should be prefixed with **`fix(FE):`**

- Follow [GitHub Flow](https://guides.github.com/introduction/flow/) guidelines for your contribution flows.

- Feel free to ping us on [`#contributing`](https://signoz-community.slack.com/archives/C01LWQ8KS7M) or [`#contributing-frontend`](https://signoz-community.slack.com/archives/C027134DM8B) on our slack community if you need any help on this :)
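As a quick self-check before pushing, a PR or commit title can be tested against a simplified Conventional Commits pattern. The regex below is an assumption (a loose sketch covering common types, not the full specification):

```shell
# Hypothetical helper: check that a title carries a type specifier like `fix(FE):`
title='fix(FE): handle empty service list'
if printf '%s\n' "$title" | grep -Eq '^(feat|fix|chore|docs|refactor|perf|test)(\([A-Za-z-]+\))?: .+'; then
  echo "title ok"
else
  echo "title is missing a type specifier" >&2
fi
```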
**[`^top^`](#)**

<hr>

# 2. How to Contribute 🙋🏻‍♂️

#### There are primarily 2 areas in which you can contribute to SigNoz

- [**Frontend**](#3-develop-frontend-) (written in TypeScript, React)
- [**Backend**](#4-contribute-to-backend-query-service-) (Query Service, written in Go)

Depending upon your area of expertise & interest, you can choose one or more to contribute. Below are detailed instructions to contribute in each area.

**Please note:** If you want to work on an issue, please ask the maintainers to assign the issue to you before starting work on it. This helps us understand who is working on an issue and prevents duplicate work. 🙏🏻

⚠️ If you just raise a PR without the corresponding issue being assigned to you, it may not be accepted.

**[`^top^`](#)**

<hr>

# 3. Develop Frontend 🌚

**Need to Update: [https://github.com/SigNoz/signoz/tree/develop/frontend](https://github.com/SigNoz/signoz/tree/develop/frontend)**

Also, have a look at the [Frontend README.md](https://github.com/SigNoz/signoz/blob/develop/frontend/README.md) sections for more info on how to set up the SigNoz frontend locally (with and without Docker).

## 3.1 Contribute to Frontend with Docker installation of SigNoz

- Clone the SigNoz repository and cd into the signoz directory,
```
git clone https://github.com/SigNoz/signoz.git && cd signoz
```
- Comment out the `frontend` service section at [`deploy/docker/clickhouse-setup/docker-compose.yaml#L68`](https://github.com/SigNoz/signoz/blob/develop/deploy/docker/clickhouse-setup/docker-compose.yaml#L68)

![develop-frontend](https://user-images.githubusercontent.com/52788043/179009217-6692616b-17dc-4d27-b587-9d007098d739.png)

- run `cd deploy` to move to the deploy directory,
- Install signoz locally **without** the frontend,
  - Add / uncomment the below configuration in the query-service section at [`deploy/docker/clickhouse-setup/docker-compose.yaml#L47`](https://github.com/SigNoz/signoz/blob/develop/deploy/docker/clickhouse-setup/docker-compose.yaml#L47)
```
ports:
  - "8080:8080"
```
<img width="869" alt="query service" src="https://user-images.githubusercontent.com/52788043/179010251-8489be31-04ca-42f8-b30d-ef0bb6accb6b.png">

  - Next run,
```
sudo docker-compose -f docker/clickhouse-setup/docker-compose.yaml up -d
```
- `cd ../frontend` and change baseURL in file [`frontend/src/constants/env.ts#L2`](https://github.com/SigNoz/signoz/blob/develop/frontend/src/constants/env.ts#L2); for that, you need to create a `.env` file in the `frontend` directory with the following environment variable (`FRONTEND_API_ENDPOINT`) matching your configuration.

If you have the backend API exposed via the frontend nginx:
```
FRONTEND_API_ENDPOINT=http://localhost:3301
```
If not:
```
FRONTEND_API_ENDPOINT=http://localhost:8080
```

- Next,
```
yarn install
yarn dev
```
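The `.env` step above can also be scripted. This is a minimal sketch; the endpoint value is an assumption, so use whichever of the two values above matches your setup:

```shell
# Sketch: write the frontend .env with the chosen endpoint (assumed value below)
endpoint="http://localhost:8080"
printf 'FRONTEND_API_ENDPOINT=%s\n' "$endpoint" > .env
cat .env
```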

### Important Notes:
The maintainers / contributors who change line numbers of `Frontend` & `Query-Section` should please update the line numbers in [`/.scripts/commentLinesForSetup.sh`](https://github.com/SigNoz/signoz/blob/develop/.scripts/commentLinesForSetup.sh)

**_Frontend should now be accessible at `http://localhost:3301/application`_**

**[`^top^`](#)**

## 3.2 Contribute to Frontend without installing SigNoz backend

If you don't want to install the SigNoz backend just for doing frontend development, we can provide you with test environments that you can use as the backend.

Please ping us in the [`#contributing`](https://signoz-community.slack.com/archives/C01LWQ8KS7M) channel or ask `@Prashant Shahi` in our [Slack Community](https://signoz.io/slack) and we will DM you with `<test environment URL>`.

- Clone the SigNoz repository and cd into the signoz/frontend directory,
```
git clone https://github.com/SigNoz/signoz.git && cd signoz/frontend
```
- Create a file `.env` in the `frontend` directory with `FRONTEND_API_ENDPOINT=<test environment URL>`
- Next,
```
yarn install
yarn dev
```

**Frontend should now be accessible at** [`http://localhost:3301/application`](http://localhost:3301/application)

**[`^top^`](#)**

<hr>

# 4. Contribute to Backend (Query-Service) 🌑

[**https://github.com/SigNoz/signoz/tree/develop/pkg/query-service**](https://github.com/SigNoz/signoz/tree/develop/pkg/query-service)

## 4.1 To run ClickHouse setup (recommended for local development)

- Clone the SigNoz repository and cd into the signoz directory,
```
git clone https://github.com/SigNoz/signoz.git && cd signoz
```
- run `sudo make dev-setup` to configure the local setup to run query-service,
- Comment out the `frontend` service section at [`deploy/docker/clickhouse-setup/docker-compose.yaml#L68`](https://github.com/SigNoz/signoz/blob/develop/deploy/docker/clickhouse-setup/docker-compose.yaml#L68)
<img width="982" alt="develop-frontend" src="https://user-images.githubusercontent.com/52788043/179043977-012be8b0-a2ed-40d1-b2e6-2ab72d7989c0.png">

- Comment out the `query-service` section at [`deploy/docker/clickhouse-setup/docker-compose.yaml#L41`](https://github.com/SigNoz/signoz/blob/develop/deploy/docker/clickhouse-setup/docker-compose.yaml#L41)
<img width="1068" alt="Screenshot 2022-07-14 at 22 48 07" src="https://user-images.githubusercontent.com/52788043/179044151-a65ba571-db0b-4a16-b64b-ca3fadcf3af0.png">

- Add the below configuration to the `clickhouse` section at [`deploy/docker/clickhouse-setup/docker-compose.yaml`](https://github.com/SigNoz/signoz/blob/develop/deploy/docker/clickhouse-setup/docker-compose.yaml)
```
ports:
  - 9001:9000
```
<img width="1013" alt="Screenshot 2022-07-14 at 22 50 37" src="https://user-images.githubusercontent.com/52788043/179044544-a293d3bc-4c4f-49ea-a276-505a381de67d.png">

- run `cd pkg/query-service/` to move to the `query-service` directory,
- Then, you need to create a `.env` file with the following environment variable
```
SIGNOZ_LOCAL_DB_PATH="./signoz.db"
```
to set your local environment with the right `RELATIONAL_DATASOURCE_PATH`, as mentioned in [`./constants/constants.go#L38`](https://github.com/SigNoz/signoz/blob/develop/pkg/query-service/constants/constants.go#L38).

- Now, install SigNoz locally **without** the `frontend` and `query-service`:
  - If you are using `x86_64` processors (all Intel/AMD processors), run `sudo make run-x86`
  - If you are on `arm64` processors (Apple M1 Macs), run `sudo make run-arm`
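If you're unsure which of the two make targets applies to your machine, the choice can be sketched from `uname -m`. The target names below are the ones from the steps above; treating `aarch64` as arm is an assumption for Linux arm hosts:

```shell
# Pick the make target for the current CPU architecture
arch=$(uname -m)
case "$arch" in
  x86_64)        target="run-x86" ;;
  arm64|aarch64) target="run-arm" ;;
  *)             echo "unexpected arch: $arch" >&2; target="" ;;
esac
echo "suggested command: sudo make $target"
```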

#### Run locally
```console
ClickHouseUrl=tcp://localhost:9001 STORAGE=clickhouse go run main.go
```
#### Build and Run locally
```
cd pkg/query-service
go build -o build/query-service main.go
ClickHouseUrl=tcp://localhost:9001 STORAGE=clickhouse build/query-service
```

#### Docker Images
The Docker images of query-service are available at https://hub.docker.com/r/signoz/query-service

```
docker pull signoz/query-service
```

```
docker pull signoz/query-service:latest
```

```
docker pull signoz/query-service:develop
```

### Important Note:
The maintainers / contributors who change line numbers of `Frontend` & `Query-Section` should please update the line numbers in [`/.scripts/commentLinesForSetup.sh`](https://github.com/SigNoz/signoz/blob/develop/.scripts/commentLinesForSetup.sh)

**Query Service should now be available at** [`http://localhost:8080`](http://localhost:8080)

If you want to see how the frontend plays with query service, you can also run the frontend in your local env with the baseURL changed to `http://localhost:8080` in file [`frontend/src/constants/env.ts`](https://github.com/SigNoz/signoz/blob/develop/frontend/src/constants/env.ts), as the `query-service` is now running at port `8080`.

---
<!-- Instead of configuring a local setup, you can also use [Gitpod](https://www.gitpod.io/), a VSCode-based Web IDE.

Click the button below. A workspace with all required environments will be created.

[![Open in Gitpod](https://gitpod.io/button/open-in-gitpod.svg)](https://gitpod.io/#https://github.com/SigNoz/signoz)

> To use it on your forked repo, edit the 'Open in Gitpod' button URL to `https://gitpod.io/#https://github.com/<your-github-username>/signoz` -->

**[`^top^`](#)**

<hr>

# 5. Contribute to SigNoz Helm Chart 📊

**Need to Update: [https://github.com/SigNoz/charts](https://github.com/SigNoz/charts).**

## 5.1 To run helm chart for local development

- Clone the SigNoz charts repository and cd into the charts directory,
```
git clone https://github.com/SigNoz/charts.git && cd charts
```
- It is recommended to use a lightweight Kubernetes (k8s) cluster for local development:
  - [kind](https://kind.sigs.k8s.io/docs/user/quick-start/#installation)
  - [k3d](https://k3d.io/#installation)
  - [minikube](https://minikube.sigs.k8s.io/docs/start/)
- Create a k8s cluster and make sure `kubectl` points to the locally created k8s cluster,
- Run `make dev-install` to install the SigNoz chart with the `my-release` release name in the `platform` namespace,
- Next run,
```
kubectl -n platform port-forward svc/my-release-signoz-frontend 3301:3301
```
to make the SigNoz UI available at [localhost:3301](http://localhost:3301)

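For the kind option, a minimal single-node cluster is enough for local development. The config file below is an assumption for illustration; kind's defaults (plain `kind create cluster`) work just as well:

```yaml
# kind-config.yaml — a minimal single-node cluster (optional)
kind: Cluster
apiVersion: kind.x-k8s.io/v1alpha4
nodes:
  - role: control-plane
```

Create the cluster with `kind create cluster --config kind-config.yaml`, then confirm `kubectl config current-context` points at it before running `make dev-install`.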
**5.1.1 To install the HotROD sample app:**

```bash
curl -sL https://github.com/SigNoz/signoz/raw/main/sample-apps/hotrod/hotrod-install.sh \
  | HELM_RELEASE=my-release SIGNOZ_NAMESPACE=platform bash
```

**5.1.2 To load data with the HotROD sample app:**

```bash
kubectl -n sample-application run strzal --image=djbingham/curl \
  --restart='OnFailure' -i --tty --rm --command -- curl -X POST -F \
  'locust_count=6' -F 'hatch_rate=2' http://locust-master:8089/swarm
```

**5.1.3 To stop the load generation:**

```bash
kubectl -n sample-application run strzal --image=djbingham/curl \
  --restart='OnFailure' -i --tty --rm --command -- curl \
  http://locust-master:8089/stop
```

**5.1.4 To delete the HotROD sample app:**

```bash
curl -sL https://github.com/SigNoz/signoz/raw/main/sample-apps/hotrod/hotrod-delete.sh \
  | HOTROD_NAMESPACE=sample-application bash
```

**[`^top^`](#)**

---

## Other Ways to Contribute

There are many other ways to get involved with the community and to participate in this project:

- Use the product, submitting GitHub issues when a problem is found.
- Help code review pull requests and participate in issue threads.
- Submit a new feature request as an issue.
- Help answer questions on forums such as Stack Overflow and the [SigNoz Community Slack Channel](https://signoz.io/slack).
- Tell others about the project on Twitter, your blog, etc.

## License

By contributing to SigNoz, you agree that your contributions will be licensed under its MIT license.

Again, feel free to ping us on [`#contributing`](https://signoz-community.slack.com/archives/C01LWQ8KS7M) or [`#contributing-frontend`](https://signoz-community.slack.com/archives/C027134DM8B) on our slack community if you need any help on this :)

Thank You!

@@ -20,6 +20,7 @@
 </default>
 <s3>
     <disk>s3</disk>
+    <perform_ttl_move_on_insert>0</perform_ttl_move_on_insert>
 </s3>
 </volumes>
 </tiered>

@@ -27,7 +27,7 @@ services:
       retries: 3

   alertmanager:
-    image: signoz/alertmanager:0.23.0-0.1
+    image: signoz/alertmanager:0.23.0-0.2
     volumes:
       - ./data/alertmanager:/data
     command:
@@ -40,7 +40,7 @@ services:
       condition: on-failure

   query-service:
-    image: signoz/query-service:0.10.0
+    image: signoz/query-service:0.10.1
     command: ["-config=/root/config/prometheus.yml"]
     # ports:
     #   - "6060:6060"       # pprof port
@@ -68,7 +68,7 @@ services:
       - clickhouse

   frontend:
-    image: signoz/frontend:0.10.0
+    image: signoz/frontend:0.10.1
     deploy:
       restart_policy:
         condition: on-failure
@@ -81,7 +81,7 @@ services:
       - ../common/nginx-config.conf:/etc/nginx/conf.d/default.conf

   otel-collector:
-    image: signoz/otelcontribcol:0.45.1-1.1
+    image: signoz/otelcontribcol:0.45.1-1.3
     command: ["--config=/etc/otel-collector-config.yaml"]
     volumes:
       - ./otel-collector-config.yaml:/etc/otel-collector-config.yaml
@@ -111,7 +111,7 @@ services:
       - clickhouse

   otel-collector-metrics:
-    image: signoz/otelcontribcol:0.45.1-1.1
+    image: signoz/otelcontribcol:0.45.1-1.3
     command: ["--config=/etc/otel-collector-metrics-config.yaml"]
     volumes:
       - ./otel-collector-metrics-config.yaml:/etc/otel-collector-metrics-config.yaml

@@ -5,9 +5,11 @@ receivers:
       # otel-collector internal metrics
       - job_name: "otel-collector"
         scrape_interval: 60s
-        static_configs:
-          - targets:
-            - otel-collector:8888
+        dns_sd_configs:
+          - names:
+              - 'tasks.otel-collector'
+            type: 'A'
+            port: 8888
       # otel-collector-metrics internal metrics
       - job_name: "otel-collector-metrics"
         scrape_interval: 60s
@@ -17,9 +19,11 @@ receivers:
       # SigNoz span metrics
       - job_name: "signozspanmetrics-collector"
         scrape_interval: 60s
-        static_configs:
-          - targets:
-            - otel-collector:8889
+        dns_sd_configs:
+          - names:
+              - 'tasks.otel-collector'
+            type: 'A'
+            port: 8889

 processors:
   batch:

@@ -20,6 +20,7 @@
 </default>
 <s3>
     <disk>s3</disk>
+    <perform_ttl_move_on_insert>0</perform_ttl_move_on_insert>
 </s3>
 </volumes>
 </tiered>

@@ -25,7 +25,7 @@ services:
       retries: 3

   alertmanager:
-    image: signoz/alertmanager:0.23.0-0.1
+    image: signoz/alertmanager:0.23.0-0.2
     volumes:
       - ./data/alertmanager:/data
     depends_on:
@@ -39,7 +39,7 @@ services:
   # Notes for Maintainers/Contributors who will change Line Numbers of Frontend & Query-Section. Please Update Line Numbers in `./scripts/commentLinesForSetup.sh` & `./CONTRIBUTING.md`

   query-service:
-    image: signoz/query-service:0.10.0
+    image: signoz/query-service:0.10.1
     container_name: query-service
     command: ["-config=/root/config/prometheus.yml"]
     # ports:
@ -66,7 +66,7 @@ services:
|
||||
condition: service_healthy
|
||||
|
||||
frontend:
|
||||
image: signoz/frontend:0.10.0
|
||||
image: signoz/frontend:0.10.1
|
||||
container_name: frontend
|
||||
restart: on-failure
|
||||
depends_on:
|
||||
@ -78,7 +78,7 @@ services:
|
||||
- ../common/nginx-config.conf:/etc/nginx/conf.d/default.conf
|
||||
|
||||
otel-collector:
|
||||
image: signoz/otelcontribcol:0.45.1-1.1
|
||||
image: signoz/otelcontribcol:0.45.1-1.3
|
||||
command: ["--config=/etc/otel-collector-config.yaml"]
|
||||
volumes:
|
||||
- ./otel-collector-config.yaml:/etc/otel-collector-config.yaml
|
||||
@ -103,7 +103,7 @@ services:
|
||||
condition: service_healthy
|
||||
|
||||
otel-collector-metrics:
|
||||
image: signoz/otelcontribcol:0.45.1-1.1
|
||||
image: signoz/otelcontribcol:0.45.1-1.3
|
||||
command: ["--config=/etc/otel-collector-metrics-config.yaml"]
|
||||
volumes:
|
||||
- ./otel-collector-metrics-config.yaml:/etc/otel-collector-metrics-config.yaml
|
||||
|
@ -204,9 +204,14 @@ start_docker() {
         echo "Starting docker service"
         $sudo_cmd systemctl start docker.service
     fi
+    # if [[ -z $sudo_cmd ]]; then
+    #     docker ps > /dev/null && true
+    #     if [[ $? -ne 0 ]]; then
+    #         request_sudo
+    #     fi
+    # fi
     if [[ -z $sudo_cmd ]]; then
-        docker ps > /dev/null && true
-        if [[ $? -ne 0 ]]; then
+        if ! docker ps > /dev/null && true; then
             request_sudo
         fi
     fi
@ -268,8 +273,12 @@ request_sudo() {
     if (( $EUID != 0 )); then
         sudo_cmd="sudo"
         echo -e "Please enter your sudo password, if prompt."
-        $sudo_cmd -l | grep -e "NOPASSWD: ALL" > /dev/null
-        if [[ $? -ne 0 ]] && ! $sudo_cmd -v; then
+        # $sudo_cmd -l | grep -e "NOPASSWD: ALL" > /dev/null
+        # if [[ $? -ne 0 ]] && ! $sudo_cmd -v; then
+        #     echo "Need sudo privileges to proceed with the installation."
+        #     exit 1;
+        # fi
+        if ! $sudo_cmd -l | grep -e "NOPASSWD: ALL" > /dev/null && ! $sudo_cmd -v; then
             echo "Need sudo privileges to proceed with the installation."
             exit 1;
         fi
@ -303,8 +312,13 @@ echo -e "🌏 Detecting your OS ...\n"
 check_os

 # Obtain unique installation id
-sysinfo="$(uname -a)"
-if [[ $? -ne 0 ]]; then
+# sysinfo="$(uname -a)"
+# if [[ $? -ne 0 ]]; then
+#     uuid="$(uuidgen)"
+#     uuid="${uuid:-$(cat /proc/sys/kernel/random/uuid)}"
+#     sysinfo="${uuid:-$(cat /proc/sys/kernel/random/uuid)}"
+# fi
+if ! sysinfo="$(uname -a)"; then
     uuid="$(uuidgen)"
     uuid="${uuid:-$(cat /proc/sys/kernel/random/uuid)}"
     sysinfo="${uuid:-$(cat /proc/sys/kernel/random/uuid)}"

@ -16,6 +16,7 @@
     "playwright": "NODE_ENV=testing playwright test --config=./playwright.config.ts",
     "playwright:local:debug": "PWDEBUG=console yarn playwright --headed --browser=chromium",
     "playwright:codegen:local": "playwright codegen http://localhost:3301",
+    "playwright:codegen:local:auth": "yarn playwright:codegen:local --load-storage=tests/auth.json",
     "husky:configure": "cd .. && husky install frontend/.husky && cd frontend && chmod ug+x .husky/*",
     "commitlint": "commitlint --edit $1"
   },

@ -14,8 +14,8 @@ const config: PlaywrightTestConfig = {
 		baseURL: process.env.PLAYWRIGHT_TEST_BASE_URL || 'http://localhost:3301',
 	},
 	updateSnapshots: 'all',
-	fullyParallel: false,
-	quiet: true,
+	fullyParallel: !!process.env.CI,
+	quiet: false,
 	testMatch: ['**/*.spec.ts'],
 	reporter: process.env.CI ? 'github' : 'list',
 };

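The `fullyParallel: !!process.env.CI` change above uses double negation to coerce an environment variable (`string | undefined`) into a boolean: parallel runs in CI, serial runs locally. A minimal sketch of the coercion; `isCI` is an illustrative helper, not part of the config:

```typescript
// !! converts any value to a boolean: undefined/'' become false,
// any non-empty string (including 'false'!) becomes true.
function isCI(env: { CI?: string }): boolean {
	return !!env.CI;
}

console.log(isCI({ CI: 'true' })); // true
console.log(isCI({})); // false
```

One caveat worth knowing: because any non-empty string is truthy, `CI='false'` would still enable parallel mode; the idiom assumes the variable is simply set or unset.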
@ -1,4 +1,11 @@
 {
 	"target_missing": "Please enter a threshold to proceed",
 	"rule_test_fired": "Test notification sent successfully",
 	"no_alerts_found": "No alerts found during the evaluation. This happens when rule condition is unsatisfied. You may adjust the rule threshold and retry.",
 	"button_testrule": "Test Notification",
 	"label_channel_select": "Notification Channels",
 	"placeholder_channel_select": "select one or more channels",
 	"channel_select_tooltip": "Leave empty to send this alert on all the configured channels",
 	"preview_chart_unexpected_error": "An unexpeced error occurred updating the chart, please check your query.",
 	"preview_chart_threshold_label": "Threshold",
 	"placeholder_label_key_pair": "Click here to enter a label (key value pairs)",

@ -1,4 +1,14 @@
 {
 	"channel_delete_unexp_error": "Something went wrong",
 	"channel_delete_success": "Channel Deleted Successfully",
 	"column_channel_name": "Name",
 	"column_channel_type": "Type",
 	"column_channel_action": "Action",
 	"column_channel_edit": "Edit",
 	"button_new_channel": "New Alert Channel",
 	"tooltip_notification_channels": "More details on how to setting notification channels",
 	"sending_channels_note": "The alerts will be sent to all the configured channels.",
 	"loading_channels_message": "Loading Channels..",
 	"page_title_create": "New Notification Channels",
 	"page_title_edit": "Edit Notification Channels",
 	"button_save_channel": "Save",

@ -1,4 +1,11 @@
 {
 	"target_missing": "Please enter a threshold to proceed",
 	"rule_test_fired": "Test notification sent successfully",
 	"no_alerts_found": "No alerts found during the evaluation. This happens when rule condition is unsatisfied. You may adjust the rule threshold and retry.",
 	"button_testrule": "Test Notification",
 	"label_channel_select": "Notification Channels",
 	"placeholder_channel_select": "select one or more channels",
 	"channel_select_tooltip": "Leave empty to send this alert on all the configured channels",
 	"preview_chart_unexpected_error": "An unexpeced error occurred updating the chart, please check your query.",
 	"preview_chart_threshold_label": "Threshold",
 	"placeholder_label_key_pair": "Click here to enter a label (key value pairs)",

@ -1,4 +1,14 @@
 {
 	"channel_delete_unexp_error": "Something went wrong",
 	"channel_delete_success": "Channel Deleted Successfully",
 	"column_channel_name": "Name",
 	"column_channel_type": "Type",
 	"column_channel_action": "Action",
 	"column_channel_edit": "Edit",
 	"button_new_channel": "New Alert Channel",
 	"tooltip_notification_channels": "More details on how to setting notification channels",
 	"sending_channels_note": "The alerts will be sent to all the configured channels.",
 	"loading_channels_message": "Loading Channels..",
 	"page_title_create": "New Notification Channels",
 	"page_title_edit": "Edit Notification Channels",
 	"button_save_channel": "Save",

frontend/src/api/alerts/patch.ts (new file, 26 lines)
@ -0,0 +1,26 @@
+import axios from 'api';
+import { ErrorResponseHandler } from 'api/ErrorResponseHandler';
+import { AxiosError } from 'axios';
+import { ErrorResponse, SuccessResponse } from 'types/api';
+import { PayloadProps, Props } from 'types/api/alerts/patch';
+
+const patch = async (
+	props: Props,
+): Promise<SuccessResponse<PayloadProps> | ErrorResponse> => {
+	try {
+		const response = await axios.patch(`/rules/${props.id}`, {
+			...props.data,
+		});
+
+		return {
+			statusCode: 200,
+			error: null,
+			message: response.data.status,
+			payload: response.data.data,
+		};
+	} catch (error) {
+		return ErrorResponseHandler(error as AxiosError);
+	}
+};
+
+export default patch;
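The new API helpers above all funnel HTTP results into a `SuccessResponse | ErrorResponse` union so callers branch on `statusCode` instead of wrapping every call in `try/catch`. The sketch below is a hypothetical, dependency-free version of that pattern: the types mirror the diff, but `fakePatch` and `patchRule` are stand-ins, not the repo's real client.

```typescript
// Simplified envelope types, modeled on the ones imported in patch.ts.
type SuccessResponse<T> = {
	statusCode: 200;
	error: null;
	message: string;
	payload: T;
};

type ErrorResponse = {
	statusCode: number;
	error: string;
	message: string;
	payload: null;
};

// Stand-in for the axios call: resolves or throws like an HTTP client would.
async function fakePatch(
	id: string,
): Promise<{ data: { status: string; data: string } }> {
	if (id === 'missing') throw new Error('404 not found');
	return { data: { status: 'success', data: `rule ${id} updated` } };
}

// The envelope pattern: every outcome becomes a plain object.
async function patchRule(
	id: string,
): Promise<SuccessResponse<string> | ErrorResponse> {
	try {
		const response = await fakePatch(id);
		return {
			statusCode: 200,
			error: null,
			message: response.data.status,
			payload: response.data.data,
		};
	} catch (error) {
		return {
			statusCode: 500,
			error: error instanceof Error ? error.message : 'unknown',
			message: 'failed',
			payload: null,
		};
	}
}

async function demo(): Promise<void> {
	const ok = await patchRule('42');
	console.log(ok.statusCode); // 200
	const bad = await patchRule('missing');
	console.log(bad.statusCode); // 500
}

demo();
```

The design choice this buys: UI code like `FormAlertRules` can check `response.statusCode === 200` and show `response.error` otherwise, with no rejected promises escaping the helper.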
frontend/src/api/alerts/testAlert.ts (new file, 26 lines)
@ -0,0 +1,26 @@
+import axios from 'api';
+import { ErrorResponseHandler } from 'api/ErrorResponseHandler';
+import { AxiosError } from 'axios';
+import { ErrorResponse, SuccessResponse } from 'types/api';
+import { PayloadProps, Props } from 'types/api/alerts/testAlert';
+
+const testAlert = async (
+	props: Props,
+): Promise<SuccessResponse<PayloadProps> | ErrorResponse> => {
+	try {
+		const response = await axios.post('/testRule', {
+			...props.data,
+		});
+
+		return {
+			statusCode: 200,
+			error: null,
+			message: response.data.status,
+			payload: response.data.data,
+		};
+	} catch (error) {
+		return ErrorResponseHandler(error as AxiosError);
+	}
+};
+
+export default testAlert;
frontend/src/api/metrics/getTopLevelOperations.ts (new file, 24 lines)
@ -0,0 +1,24 @@
+import axios from 'api';
+import { ErrorResponseHandler } from 'api/ErrorResponseHandler';
+import { AxiosError } from 'axios';
+import { ErrorResponse, SuccessResponse } from 'types/api';
+import { PayloadProps, Props } from 'types/api/metrics/getTopLevelOperations';
+
+const getTopLevelOperations = async (
+	props: Props,
+): Promise<SuccessResponse<PayloadProps> | ErrorResponse> => {
+	try {
+		const response = await axios.post(`/service/top_level_operations`);
+
+		return {
+			statusCode: 200,
+			error: null,
+			message: response.data.status,
+			payload: response.data[props.service],
+		};
+	} catch (error) {
+		return ErrorResponseHandler(error as AxiosError);
+	}
+};
+
+export default getTopLevelOperations;
@ -2,13 +2,13 @@ import axios from 'api';
 import { ErrorResponseHandler } from 'api/ErrorResponseHandler';
 import { AxiosError } from 'axios';
 import { ErrorResponse, SuccessResponse } from 'types/api';
-import { PayloadProps, Props } from 'types/api/metrics/getTopEndPoints';
+import { PayloadProps, Props } from 'types/api/metrics/getTopOperations';

-const getTopEndPoints = async (
+const getTopOperations = async (
 	props: Props,
 ): Promise<SuccessResponse<PayloadProps> | ErrorResponse> => {
 	try {
-		const response = await axios.post(`/service/top_endpoints`, {
+		const response = await axios.post(`/service/top_operations`, {
 			start: `${props.start}`,
 			end: `${props.end}`,
 			service: props.service,
@ -26,4 +26,4 @@ const getTopEndPoints = async (
 	}
 };

-export default getTopEndPoints;
+export default getTopOperations;

@ -5,6 +5,7 @@ import ROUTES from 'constants/routes';
 import useComponentPermission from 'hooks/useComponentPermission';
 import history from 'lib/history';
 import React, { useCallback, useState } from 'react';
+import { useTranslation } from 'react-i18next';
 import { useSelector } from 'react-redux';
 import { generatePath } from 'react-router-dom';
 import { AppState } from 'store/reducers';
@ -14,6 +15,7 @@ import AppReducer from 'types/reducer/app';
 import Delete from './Delete';

 function AlertChannels({ allChannels }: AlertChannelsProps): JSX.Element {
+	const { t } = useTranslation(['channels']);
 	const [notifications, Element] = notification.useNotification();
 	const [channels, setChannels] = useState<Channels[]>(allChannels);
 	const { role } = useSelector<AppState, AppReducer>((state) => state.app);
@ -29,12 +31,12 @@ function AlertChannels({ allChannels }: AlertChannelsProps): JSX.Element {

 	const columns: ColumnsType<Channels> = [
 		{
-			title: 'Name',
+			title: t('column_channel_name'),
 			dataIndex: 'name',
 			key: 'name',
 		},
 		{
-			title: 'Type',
+			title: t('column_channel_type'),
 			dataIndex: 'type',
 			key: 'type',
 		},
@ -42,14 +44,14 @@ function AlertChannels({ allChannels }: AlertChannelsProps): JSX.Element {

 	if (action) {
 		columns.push({
-			title: 'Action',
+			title: t('column_channel_action'),
 			dataIndex: 'id',
 			key: 'action',
 			align: 'center',
 			render: (id: string): JSX.Element => (
 				<>
 					<Button onClick={(): void => onClickEditHandler(id)} type="link">
-						Edit
+						{t('column_channel_edit')}
 					</Button>
 					<Delete id={id} setChannels={setChannels} notifications={notifications} />
 				</>

@ -1,29 +1,31 @@
 import { Button } from 'antd';
 import { NotificationInstance } from 'antd/lib/notification';
-import deleteAlert from 'api/channels/delete';
+import deleteChannel from 'api/channels/delete';
 import React, { useState } from 'react';
+import { useTranslation } from 'react-i18next';
 import { Channels } from 'types/api/channels/getAll';

 function Delete({ notifications, setChannels, id }: DeleteProps): JSX.Element {
+	const { t } = useTranslation(['channels']);
 	const [loading, setLoading] = useState(false);

 	const onClickHandler = async (): Promise<void> => {
 		try {
 			setLoading(true);
-			const response = await deleteAlert({
+			const response = await deleteChannel({
 				id,
 			});

 			if (response.statusCode === 200) {
 				notifications.success({
 					message: 'Success',
-					description: 'Channel Deleted Successfully',
+					description: t('channel_delete_success'),
 				});
 				setChannels((preChannels) => preChannels.filter((e) => e.id !== id));
 			} else {
 				notifications.error({
 					message: 'Error',
-					description: response.error || 'Something went wrong',
+					description: response.error || t('channel_delete_unexp_error'),
 				});
 			}
 			setLoading(false);
@ -31,7 +33,9 @@ function Delete({ notifications, setChannels, id }: DeleteProps): JSX.Element {
 			notifications.error({
 				message: 'Error',
 				description:
-					error instanceof Error ? error.toString() : 'Something went wrong',
+					error instanceof Error
+						? error.toString()
+						: t('channel_delete_unexp_error'),
 			});
 			setLoading(false);
 		}

@ -8,16 +8,18 @@ import useComponentPermission from 'hooks/useComponentPermission';
 import useFetch from 'hooks/useFetch';
 import history from 'lib/history';
 import React, { useCallback } from 'react';
+import { useTranslation } from 'react-i18next';
 import { useSelector } from 'react-redux';
 import { AppState } from 'store/reducers';
 import AppReducer from 'types/reducer/app';

 import AlertChannelsComponent from './AlertChannels';
-import { Button, ButtonContainer } from './styles';
+import { Button, ButtonContainer, RightActionContainer } from './styles';

 const { Paragraph } = Typography;

 function AlertChannels(): JSX.Element {
+	const { t } = useTranslation(['channels']);
 	const { role } = useSelector<AppState, AppReducer>((state) => state.app);
 	const [addNewChannelPermission] = useComponentPermission(
 		['add_new_channel'],
@ -34,28 +36,28 @@ function AlertChannels(): JSX.Element {
 	}

 	if (loading || payload === undefined) {
-		return <Spinner tip="Loading Channels.." height="90vh" />;
+		return <Spinner tip={t('loading_channels_message')} height="90vh" />;
 	}

 	return (
 		<>
 			<ButtonContainer>
 				<Paragraph ellipsis type="secondary">
-					The latest added channel is used as the default channel for sending alerts
+					{t('sending_channels_note')}
 				</Paragraph>

-				<div>
+				<RightActionContainer>
 					<TextToolTip
-						text="More details on how to setting notification channels"
+						text={t('tooltip_notification_channels')}
 						url="https://signoz.io/docs/userguide/alerts-management/#setting-notification-channel"
 					/>

 					{addNewChannelPermission && (
 						<Button onClick={onToggleHandler} icon={<PlusOutlined />}>
-							New Alert Channel
+							{t('button_new_channel')}
 						</Button>
 					)}
-				</div>
+				</RightActionContainer>
 			</ButtonContainer>

 			<AlertChannelsComponent allChannels={payload} />

@ -1,6 +1,13 @@
 import { Button as ButtonComponent } from 'antd';
 import styled from 'styled-components';

+export const RightActionContainer = styled.div`
+	&&& {
+		display: flex;
+		align-items: center;
+	}
+`;
+
 export const ButtonContainer = styled.div`
 	&&& {
 		display: flex;

@ -4,9 +4,12 @@ import React from 'react';
 import { useTranslation } from 'react-i18next';
 import { AlertDef, Labels } from 'types/api/alerts/def';

+import ChannelSelect from './ChannelSelect';
 import LabelSelect from './labels';
 import {
+	ChannelSelectTip,
 	FormContainer,
+	FormItemMedium,
 	InputSmall,
 	SeveritySelect,
 	StepHeading,
@ -80,7 +83,7 @@ function BasicInfo({ alertDef, setAlertDef }: BasicInfoProps): JSX.Element {
 					}}
 				/>
 			</FormItem>
-			<FormItem label={t('field_labels')}>
+			<FormItemMedium label={t('field_labels')}>
 				<LabelSelect
 					onSetLabels={(l: Labels): void => {
 						setAlertDef({
@ -92,7 +95,19 @@ function BasicInfo({ alertDef, setAlertDef }: BasicInfoProps): JSX.Element {
 					}}
 					initialValues={alertDef.labels}
 				/>
-			</FormItem>
+			</FormItemMedium>
+			<FormItemMedium label="Notification Channels">
+				<ChannelSelect
+					currentValue={alertDef.preferredChannels}
+					onSelectChannels={(s: string[]): void => {
+						setAlertDef({
+							...alertDef,
+							preferredChannels: s,
+						});
+					}}
+				/>
+				<ChannelSelectTip> {t('channel_select_tooltip')}</ChannelSelectTip>
+			</FormItemMedium>
 		</FormContainer>
 	</>
 );

@ -0,0 +1,70 @@
+import { notification, Select } from 'antd';
+import getChannels from 'api/channels/getAll';
+import useFetch from 'hooks/useFetch';
+import React from 'react';
+import { useTranslation } from 'react-i18next';
+
+import { StyledSelect } from './styles';
+
+export interface ChannelSelectProps {
+	currentValue?: string[];
+	onSelectChannels: (s: string[]) => void;
+}
+
+function ChannelSelect({
+	currentValue,
+	onSelectChannels,
+}: ChannelSelectProps): JSX.Element | null {
+	// init namespace for translations
+	const { t } = useTranslation('alerts');
+
+	const { loading, payload, error, errorMessage } = useFetch(getChannels);
+
+	const handleChange = (value: string[]): void => {
+		onSelectChannels(value);
+	};
+
+	if (error && errorMessage !== '') {
+		notification.error({
+			message: 'Error',
+			description: errorMessage,
+		});
+	}
+	const renderOptions = (): React.ReactNode[] => {
+		const children: React.ReactNode[] = [];
+
+		if (loading || payload === undefined || payload.length === 0) {
+			return children;
+		}
+
+		payload.forEach((o) => {
+			children.push(
+				<Select.Option key={o.id} value={o.name}>
+					{o.name}
+				</Select.Option>,
+			);
+		});
+
+		return children;
+	};
+	return (
+		<StyledSelect
+			status={error ? 'error' : ''}
+			mode="multiple"
+			style={{ width: '100%' }}
+			placeholder={t('placeholder_channel_select')}
+			value={currentValue}
+			onChange={(value): void => {
+				handleChange(value as string[]);
+			}}
+			optionLabelProp="label"
+		>
+			{renderOptions()}
+		</StyledSelect>
+	);
+}
+
+ChannelSelect.defaultProps = {
+	currentValue: [],
+};
+export default ChannelSelect;
@ -0,0 +1,6 @@
+import { Select } from 'antd';
+import styled from 'styled-components';
+
+export const StyledSelect = styled(Select)`
+	border-radius: 4px;
+`;
@ -21,7 +21,7 @@ export interface ChartPreviewProps {
 	selectedTime?: timePreferenceType;
 	selectedInterval?: Time;
 	headline?: JSX.Element;
-	threshold?: number;
+	threshold?: number | undefined;
 }

 function ChartPreview({
@ -35,7 +35,7 @@ function ChartPreview({
 }: ChartPreviewProps): JSX.Element | null {
 	const { t } = useTranslation('alerts');
 	const staticLine: StaticLineProps | undefined =
-		threshold && threshold > 0
+		threshold !== undefined
 			? {
 					yMin: threshold,
 					yMax: threshold,
@ -66,8 +66,12 @@ function ChartPreview({
 		}),
 		enabled:
 			query != null &&
-			(query.queryType !== EQueryType.PROM ||
-				(query.promQL?.length > 0 && query.promQL[0].query !== '')),
+			((query.queryType === EQueryType.PROM &&
+				query.promQL?.length > 0 &&
+				query.promQL[0].query !== '') ||
+				(query.queryType === EQueryType.QUERY_BUILDER &&
+					query.metricsBuilder?.queryBuilder?.length > 0 &&
+					query.metricsBuilder?.queryBuilder[0].metricName !== '')),
 	});

 	const chartDataSet = queryResponse.isError
@ -113,7 +117,7 @@ ChartPreview.defaultProps = {
 	selectedTime: 'GLOBAL_TIME',
 	selectedInterval: '5min',
 	headline: undefined,
-	threshold: 0,
+	threshold: undefined,
 };

 export default ChartPreview;

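The `threshold` change above is a classic falsy-zero fix: with a truthiness guard, a threshold of 0 is indistinguishable from "no threshold" and the static line never renders. A minimal standalone sketch of the two guards (with `StaticLineProps` reduced to the two fields used here):

```typescript
// Reduced stand-in for the component's StaticLineProps type.
type StaticLineProps = { yMin: number; yMax: number };

// Old guard: 0 and undefined are both falsy, so both yield undefined.
function staticLineTruthy(threshold?: number): StaticLineProps | undefined {
	return threshold && threshold > 0
		? { yMin: threshold, yMax: threshold }
		: undefined;
}

// New guard: only an omitted threshold yields undefined.
function staticLineDefined(threshold?: number): StaticLineProps | undefined {
	return threshold !== undefined
		? { yMin: threshold, yMax: threshold }
		: undefined;
}

console.log(staticLineTruthy(0)); // undefined — a 0 threshold is silently dropped
console.log(staticLineDefined(0)); // { yMin: 0, yMax: 0 }
```

Switching the default from `threshold: 0` to `threshold: undefined` completes the fix: "not set" and "set to zero" are now distinct states.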
@ -156,7 +156,9 @@ function RuleOptions({
 					...alertDef,
 					condition: {
 						...alertDef.condition,
-						target: (value as number) || undefined,
+						op: alertDef.condition?.op || defaultCompareOp,
+						matchType: alertDef.condition?.matchType || defaultMatchType,
+						target: value as number,
 					},
 				});
 			}}

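The change above back-fills `op` and `matchType` with defaults whenever the target is edited, so a partially-built condition never reaches the API. A framework-free sketch of the same update shape; the `Condition` type and the default values below are illustrative assumptions, not the component's real ones:

```typescript
// Hypothetical, reduced version of the alert condition object.
type Condition = { op?: string; matchType?: string; target?: number };

// Assumed defaults — the real values come from the rule form's constants.
const defaultCompareOp = '1';
const defaultMatchType = '1';

// Spread the existing condition, then fill any missing fields before
// setting the new target, mirroring the setAlertDef call in the diff.
function setTarget(condition: Condition, value: number): Condition {
	return {
		...condition,
		op: condition.op || defaultCompareOp,
		matchType: condition.matchType || defaultMatchType,
		target: value,
	};
}

console.log(setTarget({}, 90)); // { op: '1', matchType: '1', target: 90 }
console.log(setTarget({ op: '>' }, 90).op); // '>' — an existing value is kept
```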
@ -1,6 +1,7 @@
 import { ExclamationCircleOutlined, SaveOutlined } from '@ant-design/icons';
 import { FormInstance, Modal, notification, Typography } from 'antd';
 import saveAlertApi from 'api/alerts/save';
+import testAlertApi from 'api/alerts/testAlert';
 import ROUTES from 'constants/routes';
 import QueryTypeTag from 'container/NewWidget/LeftContainer/QueryTypeTag';
 import PlotTag from 'container/NewWidget/LeftContainer/WidgetGraph/PlotTag';
@ -83,7 +84,7 @@ function FormAlertRules({

 	// staged query is used to display chart preview
 	const [stagedQuery, setStagedQuery] = useState<StagedQuery>();
-	const debouncedStagedQuery = useDebounce(stagedQuery, 500);
+	const debouncedStagedQuery = useDebounce(stagedQuery, 1000);

 	// this use effect initiates staged query and
 	// other queries based on server data.
@ -143,10 +144,74 @@ function FormAlertRules({
 			});
 		}
 	};
+	const validatePromParams = useCallback((): boolean => {
+		let retval = true;
+		if (queryCategory !== EQueryType.PROM) return retval;
+
+		if (!promQueries || Object.keys(promQueries).length === 0) {
+			notification.error({
+				message: 'Error',
+				description: t('promql_required'),
+			});
+			return false;
+		}
+
+		Object.keys(promQueries).forEach((key) => {
+			if (promQueries[key].query === '') {
+				notification.error({
+					message: 'Error',
+					description: t('promql_required'),
+				});
+				retval = false;
+			}
+		});
+
+		return retval;
+	}, [t, promQueries, queryCategory]);
+
+	const validateQBParams = useCallback((): boolean => {
+		let retval = true;
+		if (queryCategory !== EQueryType.QUERY_BUILDER) return true;
+
+		if (!metricQueries || Object.keys(metricQueries).length === 0) {
+			notification.error({
+				message: 'Error',
+				description: t('condition_required'),
+			});
+			return false;
+		}
+
+		if (!alertDef.condition?.target) {
+			notification.error({
+				message: 'Error',
+				description: t('target_missing'),
+			});
+			return false;
+		}
+
+		Object.keys(metricQueries).forEach((key) => {
+			if (metricQueries[key].metricName === '') {
+				notification.error({
+					message: 'Error',
+					description: t('metricname_missing', { where: metricQueries[key].name }),
+				});
+				retval = false;
+			}
+		});
+
+		Object.keys(formulaQueries).forEach((key) => {
+			if (formulaQueries[key].expression === '') {
+				notification.error({
+					message: 'Error',
+					description: t('expression_missing', formulaQueries[key].name),
+				});
+				retval = false;
+			}
+		});
+		return retval;
+	}, [t, alertDef, queryCategory, metricQueries, formulaQueries]);
+
 	const isFormValid = useCallback((): boolean => {
 		let retval = true;

 		if (!alertDef.alert || alertDef.alert === '') {
 			notification.error({
 				message: 'Error',
@ -155,56 +220,14 @@ function FormAlertRules({
 			return false;
 		}

-		if (
-			queryCategory === EQueryType.PROM &&
-			(!promQueries || Object.keys(promQueries).length === 0)
-		) {
-			notification.error({
-				message: 'Error',
-				description: t('promql_required'),
-			});
+		if (!validatePromParams()) {
 			return false;
 		}

-		if (
-			(queryCategory === EQueryType.QUERY_BUILDER && !metricQueries) ||
-			Object.keys(metricQueries).length === 0
-		) {
-			notification.error({
-				message: 'Error',
-				description: t('condition_required'),
-			});
-			return false;
-		}
-
-		Object.keys(metricQueries).forEach((key) => {
-			if (metricQueries[key].metricName === '') {
-				retval = false;
-				notification.error({
-					message: 'Error',
-					description: t('metricname_missing', { where: metricQueries[key].name }),
-				});
-			}
-		});
-
-		Object.keys(formulaQueries).forEach((key) => {
-			if (formulaQueries[key].expression === '') {
-				retval = false;
-				notification.error({
-					message: 'Error',
-					description: t('expression_missing', formulaQueries[key].name),
-				});
-			}
-		});
-
-		return retval;
-	}, [t, alertDef, queryCategory, metricQueries, formulaQueries, promQueries]);
-
-	const saveRule = useCallback(async () => {
-		if (!isFormValid()) {
-			return;
-		}
+		return validateQBParams();
+	}, [t, validateQBParams, alertDef, validatePromParams]);

 	const preparePostData = (): AlertDef => {
 		const postableAlert: AlertDef = {
 			...alertDef,
 			source: window?.location.toString(),
@ -219,6 +242,22 @@ function FormAlertRules({
 				},
 			},
 		};
+		return postableAlert;
+	};
+
+	const memoizedPreparePostData = useCallback(preparePostData, [
+		queryCategory,
+		alertDef,
+		metricQueries,
+		formulaQueries,
+		promQueries,
+	]);
+
+	const saveRule = useCallback(async () => {
+		if (!isFormValid()) {
+			return;
+		}
+		const postableAlert = memoizedPreparePostData();

 		setLoading(true);
 		try {
@ -235,7 +274,7 @@ function FormAlertRules({
 				description:
 					!ruleId || ruleId === 0 ? t('rule_created') : t('rule_edited'),
 			});
-			console.log('invalidting cache');
+
 			// invalidate rule in cache
 			ruleCache.invalidateQueries(['ruleId', ruleId]);

@ -249,24 +288,13 @@ function FormAlertRules({
 				});
 			}
 		} catch (e) {
-			console.log('save alert api failed:', e);
 			notification.error({
 				message: 'Error',
 				description: t('unexpected_error'),
 			});
 		}
 		setLoading(false);
-	}, [
-		t,
-		isFormValid,
-		queryCategory,
-		ruleId,
-		alertDef,
-		metricQueries,
-		formulaQueries,
-		promQueries,
-		ruleCache,
-	]);
+	}, [t, isFormValid, ruleId, ruleCache, memoizedPreparePostData]);

 	const onSaveHandler = useCallback(async () => {
 		const content = (
@ -287,6 +315,44 @@ function FormAlertRules({
 		});
 	}, [t, saveRule, queryCategory]);

+	const onTestRuleHandler = useCallback(async () => {
+		if (!isFormValid()) {
+			return;
+		}
+		const postableAlert = memoizedPreparePostData();
+
+		setLoading(true);
+		try {
+			const response = await testAlertApi({ data: postableAlert });
+
+			if (response.statusCode === 200) {
+				const { payload } = response;
+				if (payload?.alertCount === 0) {
+					notification.error({
+						message: 'Error',
+						description: t('no_alerts_found'),
+					});
+				} else {
+					notification.success({
+						message: 'Success',
+						description: t('rule_test_fired'),
+					});
+				}
+			} else {
+				notification.error({
+					message: 'Error',
+					description: response.error || t('unexpected_error'),
+				});
+			}
+		} catch (e) {
+			notification.error({
+				message: 'Error',
+				description: t('unexpected_error'),
+			});
+		}
+		setLoading(false);
+	}, [t, isFormValid, memoizedPreparePostData]);
+
 	const renderBasicInfo = (): JSX.Element => (
 		<BasicInfo alertDef={alertDef} setAlertDef={setAlertDef} />
 	);
@ -353,6 +419,14 @@ function FormAlertRules({
 			>
 				{ruleId > 0 ? t('button_savechanges') : t('button_createrule')}
 			</ActionButton>
+			<ActionButton
+				loading={loading || false}
+				type="default"
+				onClick={onTestRuleHandler}
+			>
+				{' '}
+				{t('button_testrule')}
+			</ActionButton>
 			<ActionButton
 				disabled={loading || false}
 				type="default"

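The `FormAlertRules` refactor above splits one monolithic `isFormValid` into per-query-type validators (`validatePromParams`, `validateQBParams`) composed by a top-level check. A dependency-free sketch of that shape; the `Rule` type, validator names, and error strings below are illustrative, not the component's real ones:

```typescript
// Hypothetical, reduced rule shape: optional fields stand in for the
// PromQL and query-builder branches of the real form state.
type Rule = { name: string; promQuery?: string; metricName?: string };

function validateName(rule: Rule, errors: string[]): boolean {
	if (rule.name === '') {
		errors.push('alert name is required');
		return false;
	}
	return true;
}

function validateProm(rule: Rule, errors: string[]): boolean {
	if (rule.promQuery !== undefined && rule.promQuery === '') {
		errors.push('promql_required');
		return false;
	}
	return true;
}

function validateQB(rule: Rule, errors: string[]): boolean {
	if (rule.metricName !== undefined && rule.metricName === '') {
		errors.push('metricname_missing');
		return false;
	}
	return true;
}

// Top-level check mirrors the refactored isFormValid: each concern lives
// in its own small function, and && short-circuits on the first failure.
function isFormValid(rule: Rule, errors: string[]): boolean {
	return (
		validateName(rule, errors) &&
		validateProm(rule, errors) &&
		validateQB(rule, errors)
	);
}

const errors: string[] = [];
console.log(isFormValid({ name: 'cpu high', promQuery: '' }, errors)); // false
console.log(errors); // [ 'promql_required' ]
```

The payoff in the diff is the same as here: each validator gets its own narrow dependency list, so `useCallback` memoization stays tight and the save and test handlers can share one entry point.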
@ -8,8 +8,7 @@ interface SearchContainerProps {
 }

 export const SearchContainer = styled.div<SearchContainerProps>`
-	width: 70%;
-	border-radisu: 4px;
+	border-radius: 4px;
 	background: ${({ isDarkMode }): string => (isDarkMode ? '#000' : '#fff')};
 	flex: 1;
 	display: flex;

@ -1,4 +1,15 @@
-import { Button, Card, Col, Form, Input, InputNumber, Row, Select } from 'antd';
+import {
+	Button,
+	Card,
+	Col,
+	Form,
+	Input,
+	InputNumber,
+	Row,
+	Select,
+	Typography,
+} from 'antd';
 import FormItem from 'antd/lib/form/FormItem';
 import TextArea from 'antd/lib/input/TextArea';
 import styled from 'styled-components';

@ -67,21 +78,19 @@ export const InlineSelect = styled(Select)`
 `;

 export const SeveritySelect = styled(Select)`
-	width: 15% !important;
+	width: 25% !important;
 `;

 export const InputSmall = styled(Input)`
 	width: 40% !important;
 `;

-export const FormContainer = styled.div`
+export const FormContainer = styled(Card)`
 	padding: 2em;
 	margin-top: 1rem;
 	display: flex;
 	flex-direction: column;
-	background: #141414;
 	border-radius: 4px;
 	border: 1px solid #303030;
 `;

 export const ThresholdInput = styled(InputNumber)`
@ -101,3 +110,11 @@ export const ThresholdInput = styled(InputNumber)`
 export const TextareaMedium = styled(TextArea)`
 	width: 70%;
 `;

+export const FormItemMedium = styled(FormItem)`
+	width: 70%;
+`;
+
+export const ChannelSelectTip = styled(Typography.Text)`
+	color: hsla(0, 0%, 100%, 0.3);
+`;

@@ -1,10 +1,11 @@
import { Button } from 'antd';
import { NotificationInstance } from 'antd/lib/notification/index';
import deleteAlerts from 'api/alerts/delete';
import { State } from 'hooks/useFetch';
import React, { useState } from 'react';
import { PayloadProps as DeleteAlertPayloadProps } from 'types/api/alerts/delete';
import { Alerts } from 'types/api/alerts/getAll';
import { GettableAlert } from 'types/api/alerts/get';

import { ColumnButton } from './styles';

function DeleteAlert({
	id,
@@ -72,20 +73,20 @@ function DeleteAlert({
	};

	return (
		<Button
		<ColumnButton
			disabled={deleteAlertState.loading || false}
			loading={deleteAlertState.loading || false}
			onClick={(): Promise<void> => onDeleteHandler(id)}
			type="link"
		>
			Delete
		</Button>
		</ColumnButton>
	);
}

interface DeleteAlertProps {
	id: Alerts['id'];
	setData: React.Dispatch<React.SetStateAction<Alerts[]>>;
	id: GettableAlert['id'];
	setData: React.Dispatch<React.SetStateAction<GettableAlert[]>>;
	notifications: NotificationInstance;
}
@@ -1,6 +1,6 @@
/* eslint-disable react/display-name */
import { PlusOutlined } from '@ant-design/icons';
import { notification, Tag, Typography } from 'antd';
import { notification, Typography } from 'antd';
import Table, { ColumnsType } from 'antd/lib/table';
import TextToolTip from 'components/TextToolTip';
import ROUTES from 'constants/routes';
@@ -13,15 +13,16 @@ import { UseQueryResult } from 'react-query';
import { useSelector } from 'react-redux';
import { AppState } from 'store/reducers';
import { ErrorResponse, SuccessResponse } from 'types/api';
import { Alerts } from 'types/api/alerts/getAll';
import { GettableAlert } from 'types/api/alerts/get';
import AppReducer from 'types/reducer/app';

import DeleteAlert from './DeleteAlert';
import { Button, ButtonContainer } from './styles';
import { Button, ButtonContainer, ColumnButton, StyledTag } from './styles';
import Status from './TableComponents/Status';
import ToggleAlertState from './ToggleAlertState';

function ListAlert({ allAlertRules, refetch }: ListAlertProps): JSX.Element {
	const [data, setData] = useState<Alerts[]>(allAlertRules || []);
	const [data, setData] = useState<GettableAlert[]>(allAlertRules || []);
	const { t } = useTranslation('common');
	const { role } = useSelector<AppState, AppReducer>((state) => state.app);
	const [addNewAlert, action] = useComponentPermission(
@@ -53,22 +54,27 @@ function ListAlert({ allAlertRules, refetch }: ListAlertProps): JSX.Element {
		history.push(`${ROUTES.EDIT_ALERTS}?ruleId=${id}`);
	};

	const columns: ColumnsType<Alerts> = [
	const columns: ColumnsType<GettableAlert> = [
		{
			title: 'Status',
			dataIndex: 'state',
			key: 'state',
			sorter: (a, b): number =>
				b.labels.severity.length - a.labels.severity.length,
				(b.state ? b.state.charCodeAt(0) : 1000) -
				(a.state ? a.state.charCodeAt(0) : 1000),
			render: (value): JSX.Element => <Status status={value} />,
		},
		{
			title: 'Alert Name',
			dataIndex: 'alert',
			key: 'name',
			sorter: (a, b): number => a.name.charCodeAt(0) - b.name.charCodeAt(0),
			sorter: (a, b): number =>
				(a.alert ? a.alert.charCodeAt(0) : 1000) -
				(b.alert ? b.alert.charCodeAt(0) : 1000),
			render: (value, record): JSX.Element => (
				<Typography.Link onClick={(): void => onEditHandler(record.id.toString())}>
				<Typography.Link
					onClick={(): void => onEditHandler(record.id ? record.id.toString() : '')}
				>
					{value}
				</Typography.Link>
			),
@@ -78,7 +84,8 @@ function ListAlert({ allAlertRules, refetch }: ListAlertProps): JSX.Element {
			dataIndex: 'labels',
			key: 'severity',
			sorter: (a, b): number =>
				a.labels.severity.length - b.labels.severity.length,
				(a.labels ? a.labels.severity.length : 0) -
				(b.labels ? b.labels.severity.length : 0),
			render: (value): JSX.Element => {
				const objectKeys = Object.keys(value);
				const withSeverityKey = objectKeys.find((e) => e === 'severity') || '';
@@ -92,6 +99,7 @@ function ListAlert({ allAlertRules, refetch }: ListAlertProps): JSX.Element {
			dataIndex: 'labels',
			key: 'tags',
			align: 'center',
			width: 350,
			render: (value): JSX.Element => {
				const objectKeys = Object.keys(value);
				const withOutSeverityKeys = objectKeys.filter((e) => e !== 'severity');
@@ -104,9 +112,9 @@ function ListAlert({ allAlertRules, refetch }: ListAlertProps): JSX.Element {
					<>
						{withOutSeverityKeys.map((e) => {
							return (
								<Tag key={e} color="magenta">
								<StyledTag key={e} color="magenta">
									{e}: {value[e]}
								</Tag>
								</StyledTag>
							);
						})}
					</>
@@ -120,14 +128,19 @@ function ListAlert({ allAlertRules, refetch }: ListAlertProps): JSX.Element {
			title: 'Action',
			dataIndex: 'id',
			key: 'action',
			render: (id: Alerts['id']): JSX.Element => {
			render: (id: GettableAlert['id'], record): JSX.Element => {
				return (
					<>
						<DeleteAlert notifications={notifications} setData={setData} id={id} />
						<ToggleAlertState disabled={record.disabled} setData={setData} id={id} />

						<Button onClick={(): void => onEditHandler(id.toString())} type="link">
						<ColumnButton
							onClick={(): void => onEditHandler(id.toString())}
							type="link"
						>
							Edit
						</Button>
						</ColumnButton>

						<DeleteAlert notifications={notifications} setData={setData} id={id} />
					</>
				);
			},
@@ -159,8 +172,10 @@ function ListAlert({ allAlertRules, refetch }: ListAlertProps): JSX.Element {
}

interface ListAlertProps {
	allAlertRules: Alerts[];
	refetch: UseQueryResult<ErrorResponse | SuccessResponse<Alerts[]>>['refetch'];
	allAlertRules: GettableAlert[];
	refetch: UseQueryResult<
		ErrorResponse | SuccessResponse<GettableAlert[]>
	>['refetch'];
}

export default ListAlert;
@@ -1,6 +1,6 @@
import { Tag } from 'antd';
import React from 'react';
import { Alerts } from 'types/api/alerts/getAll';
import { GettableAlert } from 'types/api/alerts/get';

function Status({ status }: StatusProps): JSX.Element {
	switch (status) {
@@ -16,14 +16,18 @@ function Status({ status }: StatusProps): JSX.Element {
			return <Tag color="red">Firing</Tag>;
		}

		case 'disabled': {
			return <Tag>Disabled</Tag>;
		}

		default: {
			return <Tag color="default">Unknown Status</Tag>;
			return <Tag color="default">Unknown</Tag>;
		}
	}
}

interface StatusProps {
	status: Alerts['state'];
	status: GettableAlert['state'];
}

export default Status;
frontend/src/container/ListAlertRules/ToggleAlertState.tsx (new file, 108 lines)
@@ -0,0 +1,108 @@
import { notification } from 'antd';
import patchAlert from 'api/alerts/patch';
import { State } from 'hooks/useFetch';
import React, { useState } from 'react';
import { GettableAlert } from 'types/api/alerts/get';
import { PayloadProps as PatchPayloadProps } from 'types/api/alerts/patch';

import { ColumnButton } from './styles';

function ToggleAlertState({
	id,
	disabled,
	setData,
}: ToggleAlertStateProps): JSX.Element {
	const [apiStatus, setAPIStatus] = useState<State<PatchPayloadProps>>({
		error: false,
		errorMessage: '',
		loading: false,
		success: false,
		payload: undefined,
	});

	const defaultErrorMessage = 'Something went wrong';

	const onToggleHandler = async (
		id: number,
		disabled: boolean,
	): Promise<void> => {
		try {
			setAPIStatus((state) => ({
				...state,
				loading: true,
			}));

			const response = await patchAlert({
				id,
				data: {
					disabled,
				},
			});

			if (response.statusCode === 200) {
				setData((state) => {
					return state.map((alert) => {
						if (alert.id === id) {
							return {
								...alert,
								disabled: response.payload.disabled,
								state: response.payload.state,
							};
						}
						return alert;
					});
				});

				setAPIStatus((state) => ({
					...state,
					loading: false,
					payload: response.payload,
				}));
				notification.success({
					message: 'Success',
				});
			} else {
				setAPIStatus((state) => ({
					...state,
					loading: false,
					error: true,
					errorMessage: response.error || defaultErrorMessage,
				}));

				notification.error({
					message: response.error || defaultErrorMessage,
				});
			}
		} catch (error) {
			setAPIStatus((state) => ({
				...state,
				loading: false,
				error: true,
				errorMessage: defaultErrorMessage,
			}));

			notification.error({
				message: defaultErrorMessage,
			});
		}
	};

	return (
		<ColumnButton
			disabled={apiStatus.loading || false}
			loading={apiStatus.loading || false}
			onClick={(): Promise<void> => onToggleHandler(id, !disabled)}
			type="link"
		>
			{disabled ? 'Enable' : 'Disable'}
		</ColumnButton>
	);
}

interface ToggleAlertStateProps {
	id: GettableAlert['id'];
	disabled: boolean;
	setData: React.Dispatch<React.SetStateAction<GettableAlert[]>>;
}

export default ToggleAlertState;
@@ -1,4 +1,4 @@
import { Button as ButtonComponent } from 'antd';
import { Button as ButtonComponent, Tag } from 'antd';
import styled from 'styled-components';

export const ButtonContainer = styled.div`
@@ -12,6 +12,20 @@ export const ButtonContainer = styled.div`

export const Button = styled(ButtonComponent)`
	&&& {
		margin-left: 1rem;
		margin-left: 1em;
	}
`;

export const ColumnButton = styled(ButtonComponent)`
	&&& {
		padding-left: 0;
		padding-right: 0;
		margin-right: 1.5em;
	}
`;

export const StyledTag = styled(Tag)`
	&&& {
		white-space: normal;
	}
`;
@@ -15,7 +15,7 @@ import { PromQLWidgets } from 'types/api/dashboard/getAll';
import MetricReducer from 'types/reducer/metrics';

import { Card, Col, GraphContainer, GraphTitle, Row } from '../styles';
import TopEndpointsTable from '../TopEndpointsTable';
import TopOperationsTable from '../TopOperationsTable';
import { Button } from './styles';

function Application({ getWidget }: DashboardProps): JSX.Element {
@@ -23,11 +23,13 @@ function Application({ getWidget }: DashboardProps): JSX.Element {
	const selectedTimeStamp = useRef(0);

	const {
		topEndPoints,
		topOperations,
		serviceOverview,
		resourceAttributePromQLQuery,
		resourceAttributeQueries,
		topLevelOperations,
	} = useSelector<AppState, MetricReducer>((state) => state.metrics);
	const operationsRegex = topLevelOperations.join('|');

	const selectedTraceTags: string = JSON.stringify(
		convertRawQueriesToTraceSelectedTags(resourceAttributeQueries, 'array') || [],
@@ -107,7 +109,7 @@ function Application({ getWidget }: DashboardProps): JSX.Element {
	<Button
		type="default"
		size="small"
		id="Application_button"
		id="Service_button"
		onClick={(): void => {
			onTracePopupClick(selectedTimeStamp.current);
		}}
@@ -115,13 +117,13 @@ function Application({ getWidget }: DashboardProps): JSX.Element {
		View Traces
	</Button>
	<Card>
		<GraphTitle>Application latency</GraphTitle>
		<GraphTitle>Latency</GraphTitle>
		<GraphContainer>
			<Graph
				onClickHandler={(ChartEvent, activeElements, chart, data): void => {
					onClickHandler(ChartEvent, activeElements, chart, data, 'Application');
					onClickHandler(ChartEvent, activeElements, chart, data, 'Service');
				}}
				name="application_latency"
				name="service_latency"
				type="line"
				data={{
					datasets: [
@@ -175,7 +177,7 @@ function Application({ getWidget }: DashboardProps): JSX.Element {
	<Button
		type="default"
		size="small"
		id="Request_button"
		id="Rate_button"
		onClick={(): void => {
			onTracePopupClick(selectedTimeStamp.current);
		}}
@@ -183,21 +185,21 @@ function Application({ getWidget }: DashboardProps): JSX.Element {
		View Traces
	</Button>
	<Card>
		<GraphTitle>Requests</GraphTitle>
		<GraphTitle>Rate (ops/s)</GraphTitle>
		<GraphContainer>
			<FullView
				name="request_per_sec"
				name="operations_per_sec"
				fullViewOptions={false}
				onClickHandler={(event, element, chart, data): void => {
					onClickHandler(event, element, chart, data, 'Request');
					onClickHandler(event, element, chart, data, 'Rate');
				}}
				widget={getWidget([
					{
						query: `sum(rate(signoz_latency_count{service_name="${servicename}", span_kind="SPAN_KIND_SERVER"${resourceAttributePromQLQuery}}[5m]))`,
						legend: 'Requests',
						query: `sum(rate(signoz_latency_count{service_name="${servicename}", operation=~"${operationsRegex}"${resourceAttributePromQLQuery}}[5m]))`,
						legend: 'Operations',
					},
				])}
				yAxisUnit="reqps"
				yAxisUnit="ops"
			/>
		</GraphContainer>
	</Card>
@@ -227,7 +229,7 @@ function Application({ getWidget }: DashboardProps): JSX.Element {
				}}
				widget={getWidget([
					{
						query: `max(sum(rate(signoz_calls_total{service_name="${servicename}", span_kind="SPAN_KIND_SERVER", status_code="STATUS_CODE_ERROR"${resourceAttributePromQLQuery}}[5m]) OR rate(signoz_calls_total{service_name="${servicename}", span_kind="SPAN_KIND_SERVER", http_status_code=~"5.."${resourceAttributePromQLQuery}}[5m]))*100/sum(rate(signoz_calls_total{service_name="${servicename}", span_kind="SPAN_KIND_SERVER"${resourceAttributePromQLQuery}}[5m]))) < 1000 OR vector(0)`,
						query: `max(sum(rate(signoz_calls_total{service_name="${servicename}", operation=~"${operationsRegex}", status_code="STATUS_CODE_ERROR"${resourceAttributePromQLQuery}}[5m]) OR rate(signoz_calls_total{service_name="${servicename}", operation=~"${operationsRegex}", http_status_code=~"5.."${resourceAttributePromQLQuery}}[5m]))*100/sum(rate(signoz_calls_total{service_name="${servicename}", operation=~"${operationsRegex}"${resourceAttributePromQLQuery}}[5m]))) < 1000 OR vector(0)`,
						legend: 'Error Percentage',
					},
				])}
@@ -239,7 +241,7 @@ function Application({ getWidget }: DashboardProps): JSX.Element {

	<Col span={12}>
		<Card>
			<TopEndpointsTable data={topEndPoints} />
			<TopOperationsTable data={topOperations} />
		</Card>
	</Col>
</Row>
@@ -11,7 +11,7 @@ import { AppState } from 'store/reducers';
import { GlobalReducer } from 'types/reducer/globalTime';
import MetricReducer from 'types/reducer/metrics';

function TopEndpointsTable(props: TopEndpointsTableProps): JSX.Element {
function TopOperationsTable(props: TopOperationsTableProps): JSX.Element {
	const { minTime, maxTime } = useSelector<AppState, GlobalReducer>(
		(state) => state.globalTime,
	);
@@ -85,7 +85,7 @@ function TopEndpointsTable(props: TopEndpointsTableProps): JSX.Element {
			title: 'Number of Calls',
			dataIndex: 'numCalls',
			key: 'numCalls',
			sorter: (a: TopEndpointListItem, b: TopEndpointListItem): number =>
			sorter: (a: TopOperationListItem, b: TopOperationListItem): number =>
				a.numCalls - b.numCalls,
		},
	];
@@ -94,7 +94,7 @@ function TopEndpointsTable(props: TopEndpointsTableProps): JSX.Element {
		<Table
			showHeader
			title={(): string => {
				return 'Top Endpoints';
				return 'Key Operations';
			}}
			tableLayout="fixed"
			dataSource={data}
@@ -104,7 +104,7 @@ function TopEndpointsTable(props: TopEndpointsTableProps): JSX.Element {
	);
}

interface TopEndpointListItem {
interface TopOperationListItem {
	p50: number;
	p95: number;
	p99: number;
@@ -112,10 +112,10 @@ interface TopEndpointListItem {
	name: string;
}

type DataProps = TopEndpointListItem;
type DataProps = TopOperationListItem;

interface TopEndpointsTableProps {
	data: TopEndpointListItem[];
interface TopOperationsTableProps {
	data: TopOperationListItem[];
}

export default TopEndpointsTable;
export default TopOperationsTable;
@@ -56,14 +56,14 @@ function Metrics(): JSX.Element {
			render: (value: number): string => (value / 1000000).toFixed(2),
		},
		{
			title: 'Error Rate (% of requests)',
			title: 'Error Rate (% of total)',
			dataIndex: 'errorRate',
			key: 'errorRate',
			sorter: (a: DataProps, b: DataProps): number => a.errorRate - b.errorRate,
			render: (value: number): string => value.toFixed(2),
		},
		{
			title: 'Requests Per Second',
			title: 'Operations Per Second',
			dataIndex: 'callRate',
			key: 'callRate',
			sorter: (a: DataProps, b: DataProps): number => a.callRate - b.callRate,
@@ -42,8 +42,9 @@ export interface Option {
}

export const ServiceMapOptions: Option[] = [
	{ value: '1min', label: 'Last 1 min' },
	{ value: '5min', label: 'Last 5 min' },
	{ value: '15min', label: 'Last 15 min' },
	{ value: '30min', label: 'Last 30 min' },
];

export const getDefaultOption = (route: string): Time => {
@@ -2,7 +2,7 @@
import type { SelectProps } from 'antd';
import { Tag } from 'antd';
import React, { useCallback, useMemo } from 'react';
import { Alerts } from 'types/api/alerts/getAll';
import { Alerts } from 'types/api/alerts/getTriggered';

import { Container, Select } from './styles';
@@ -2,7 +2,7 @@ import { Tag, Typography } from 'antd';
import convertDateToAmAndPm from 'lib/convertDateToAmAndPm';
import getFormattedDate from 'lib/getFormatedDate';
import React from 'react';
import { Alerts } from 'types/api/alerts/getAll';
import { Alerts } from 'types/api/alerts/getTriggered';

import Status from '../TableComponents/AlertStatus';
import { TableCell, TableRow } from './styles';
@@ -1,7 +1,7 @@
import { MinusSquareOutlined, PlusSquareOutlined } from '@ant-design/icons';
import { Tag } from 'antd';
import React, { useState } from 'react';
import { Alerts } from 'types/api/alerts/getAll';
import { Alerts } from 'types/api/alerts/getTriggered';

import ExapandableRow from './ExapandableRow';
import { IconContainer, StatusContainer, TableCell, TableRow } from './styles';
@@ -1,6 +1,6 @@
import groupBy from 'lodash-es/groupBy';
import React, { useMemo } from 'react';
import { Alerts } from 'types/api/alerts/getAll';
import { Alerts } from 'types/api/alerts/getTriggered';

import { Value } from '../Filter';
import { FilterAlerts } from '../utils';
@@ -5,7 +5,7 @@ import AlertStatus from 'container/TriggeredAlerts/TableComponents/AlertStatus';
import convertDateToAmAndPm from 'lib/convertDateToAmAndPm';
import getFormattedDate from 'lib/getFormatedDate';
import React from 'react';
import { Alerts } from 'types/api/alerts/getAll';
import { Alerts } from 'types/api/alerts/getTriggered';

import { Value } from './Filter';
import { FilterAlerts } from './utils';
@@ -1,7 +1,7 @@
import getTriggeredApi from 'api/alerts/getTriggered';
import useInterval from 'hooks/useInterval';
import React, { useState } from 'react';
import { Alerts } from 'types/api/alerts/getAll';
import { Alerts } from 'types/api/alerts/getTriggered';

import Filter, { Value } from './Filter';
import FilteredTable from './FilteredTable';
@@ -1,4 +1,4 @@
import { Alerts } from 'types/api/alerts/getAll';
import { Alerts } from 'types/api/alerts/getTriggered';

import { Value } from './Filter';
@@ -45,6 +45,9 @@ interface graphLink {
	source: string;
	target: string;
	value: number;
	callRate: number;
	errorRate: number;
	p99: number;
}
export interface graphDataType {
	nodes: graphNode[];
@@ -96,16 +99,16 @@ function ServiceMap(props: ServiceMapProps): JSX.Element {
	const graphData = { nodes, links };
	return (
		<Container>
			<SelectService
				services={serviceMap.services}
			{/* <SelectService
				services={serviceMap.items}
				zoomToService={zoomToService}
				zoomToDefault={zoomToDefault}
			/>
			/> */}
			<ForceGraph2D
				ref={fgRef}
				cooldownTicks={100}
				graphData={graphData}
				nodeLabel={getTooltip}
				linkLabel={getTooltip}
				linkAutoColorBy={(d) => d.target}
				linkDirectionalParticles="value"
				linkDirectionalParticleSpeed={(d) => d.value}
@@ -124,7 +127,7 @@ function ServiceMap(props: ServiceMapProps): JSX.Element {
					ctx.fillStyle = isDarkMode ? '#ffffff' : '#000000';
					ctx.fillText(label, node.x, node.y);
				}}
				onNodeClick={(node) => {
				onLinkHover={(node) => {
					const tooltip = document.querySelector('.graph-tooltip');
					if (tooltip && node) {
						tooltip.innerHTML = getTooltip(node);
@@ -1,12 +1,13 @@
/*eslint-disable*/
//@ts-nocheck

import { cloneDeep, find, maxBy, uniq, uniqBy } from 'lodash-es';
import { cloneDeep, find, maxBy, uniq, uniqBy, groupBy, sumBy } from 'lodash-es';
import { graphDataType } from './ServiceMap';

const MIN_WIDTH = 10;
const MAX_WIDTH = 20;
const DEFAULT_FONT_SIZE = 6;

export const getDimensions = (num, highest) => {
	const percentage = (num / highest) * 100;
	const width = (percentage * (MAX_WIDTH - MIN_WIDTH)) / 100 + MIN_WIDTH;
@@ -18,19 +19,30 @@ export const getDimensions = (num, highest) => {
};

export const getGraphData = (serviceMap, isDarkMode): graphDataType => {
	const { items, services } = serviceMap;
	const { items } = serviceMap;
	const services = Object.values(groupBy(items, 'child')).map((e) => {
		return {
			serviceName: e[0].child,
			errorRate: sumBy(e, 'errorRate'),
			callRate: sumBy(e, 'callRate'),
		}
	});
	const highestCallCount = maxBy(items, (e) => e?.callCount)?.callCount;
	const highestCallRate = maxBy(services, (e) => e?.callRate)?.callRate;

	const divNum = Number(
		String(1).padEnd(highestCallCount.toString().length, '0'),
	);

	const links = cloneDeep(items).map((node) => {
		const { parent, child, callCount } = node;
		const { parent, child, callCount, callRate, errorRate, p99 } = node;
		return {
			source: parent,
			target: child,
			value: (100 - callCount / divNum) * 0.03,
			callRate,
			errorRate,
			p99,
		};
	});
	const uniqParent = uniqBy(cloneDeep(items), 'parent').map((e) => e.parent);
@@ -47,15 +59,10 @@ export const getGraphData = (serviceMap, isDarkMode): graphDataType => {
			width: MIN_WIDTH,
			color,
			nodeVal: MIN_WIDTH,
			callRate: 0,
			errorRate: 0,
			p99: 0,
		};
	}
	if (service.errorRate > 0) {
		color = isDarkMode ? '#DB836E' : '#F98989';
	} else if (service.fourXXRate > 0) {
		color = isDarkMode ? '#C79931' : '#F9DA7B';
	}
	const { fontSize, width } = getDimensions(service.callRate, highestCallRate);
	return {
@@ -65,9 +72,6 @@ export const getGraphData = (serviceMap, isDarkMode): graphDataType => {
		width,
		color,
		nodeVal: width,
		callRate: service.callRate.toFixed(2),
		errorRate: service.errorRate,
		p99: service.p99,
	};
});
return {
@@ -90,25 +94,31 @@ export const getZoomPx = (): number => {
	return 190;
};

export const getTooltip = (node: {
const getRound2DigitsAfterDecimal = (num: number) => {
	if (num === 0) {
		return 0;
	}
	return num.toFixed(20).match(/^-?\d*\.?0*\d{0,2}/)[0];
}

export const getTooltip = (link: {
	p99: number;
	errorRate: number;
	callRate: number;
	id: string;
}) => {
	return `<div style="color:#333333;padding:12px;background: white;border-radius: 2px;">
		<div style="font-weight:bold; margin-bottom:16px;">${node.id}</div>
		<div class="keyval">
			<div class="key">P99 latency:</div>
			<div class="val">${node.p99 / 1000000}ms</div>
			<div class="val">${getRound2DigitsAfterDecimal(link.p99/ 1000000)}ms</div>
		</div>
		<div class="keyval">
			<div class="key">Request:</div>
			<div class="val">${node.callRate}/sec</div>
			<div class="val">${getRound2DigitsAfterDecimal(link.callRate)}/sec</div>
		</div>
		<div class="keyval">
			<div class="key">Error Rate:</div>
			<div class="val">${node.errorRate}%</div>
			<div class="val">${getRound2DigitsAfterDecimal(link.errorRate)}%</div>
		</div>
	</div>`;
};
@@ -3,7 +3,8 @@
// import getExternalError from 'api/metrics/getExternalError';
// import getExternalService from 'api/metrics/getExternalService';
import getServiceOverview from 'api/metrics/getServiceOverview';
import getTopEndPoints from 'api/metrics/getTopEndPoints';
import getTopLevelOperations from 'api/metrics/getTopLevelOperations';
import getTopOperations from 'api/metrics/getTopOperations';
import { AxiosError } from 'axios';
import GetMinMax from 'lib/getMinMax';
import getStep from 'lib/getStep';
@@ -46,7 +47,8 @@ export const GetInitialData = (
		// getExternalErrorResponse,
		// getExternalServiceResponse,
		getServiceOverviewResponse,
		getTopEndPointsResponse,
		getTopOperationsResponse,
		getTopLevelOperationsResponse,
	] = await Promise.all([
		// getDBOverView({
		// ...props,
@@ -67,12 +69,15 @@ export const GetInitialData = (
			step: getStep({ start: minTime, end: maxTime, inputFormat: 'ns' }),
			selectedTags: props.selectedTags,
		}),
		getTopEndPoints({
		getTopOperations({
			end: maxTime,
			service: props.serviceName,
			start: minTime,
			selectedTags: props.selectedTags,
		}),
		getTopLevelOperations({
			service: props.serviceName,
		}),
	]);

	if (
@@ -81,7 +86,8 @@ export const GetInitialData = (
		// getExternalErrorResponse.statusCode === 200 &&
		// getExternalServiceResponse.statusCode === 200 &&
		getServiceOverviewResponse.statusCode === 200 &&
		getTopEndPointsResponse.statusCode === 200
		getTopOperationsResponse.statusCode === 200 &&
		getTopLevelOperationsResponse.statusCode === 200
	) {
		dispatch({
			type: 'GET_INTIAL_APPLICATION_DATA',
@@ -91,7 +97,8 @@ export const GetInitialData = (
				// externalError: getExternalErrorResponse.payload,
				// externalService: getExternalServiceResponse.payload,
				serviceOverview: getServiceOverviewResponse.payload,
				topEndPoints: getTopEndPointsResponse.payload,
				topOperations: getTopOperationsResponse.payload,
				topLevelOperations: getTopLevelOperationsResponse.payload,
			},
		});
	} else {
@@ -99,8 +106,9 @@ export const GetInitialData = (
			type: 'GET_INITIAL_APPLICATION_ERROR',
			payload: {
				errorMessage:
					getTopEndPointsResponse.error ||
					getTopOperationsResponse.error ||
					getServiceOverviewResponse.error ||
					getTopLevelOperationsResponse.error ||
					// getExternalServiceResponse.error ||
					// getExternalErrorResponse.error ||
					// getExternalAverageDurationResponse.error ||
@@ -6,26 +6,16 @@ import { ActionTypes } from './types';

export interface ServiceMapStore {
	items: ServicesMapItem[];
	services: ServicesItem[];
	loading: boolean;
}

export interface ServicesItem {
	serviceName: string;
	p99: number;
	avgDuration: number;
	numCalls: number;
	callRate: number;
	numErrors: number;
	errorRate: number;
	num4XX: number;
	fourXXRate: number;
}

export interface ServicesMapItem {
	parent: string;
	child: string;
	callCount: number;
	callRate: number;
	errorRate: number;
	p99: number;
}

export interface ServiceMapItemAction {
@@ -33,11 +23,6 @@ export interface ServiceMapItemAction {
	payload: ServicesMapItem[];
}

export interface ServicesAction {
	type: ActionTypes.getServices;
	payload: ServicesItem[];
}

export interface ServiceMapLoading {
	type: ActionTypes.serviceMapLoading;
	payload: {
@@ -55,19 +40,13 @@ export const getDetailedServiceMapItems = (globalTime: GlobalTime) => {
		end,
		tags: [],
	};
	const [serviceMapDependenciesResponse, response] = await Promise.all([
		api.post<ServicesMapItem[]>(`/serviceMapDependencies`, serviceMapPayload),
		api.post<ServicesItem[]>(`/services`, serviceMapPayload),
	const [dependencyGraphResponse] = await Promise.all([
		api.post<ServicesMapItem[]>(`/dependency_graph`, serviceMapPayload),
	]);

	dispatch<ServicesAction>({
		type: ActionTypes.getServices,
		payload: response.data,
	});

	dispatch<ServiceMapItemAction>({
		type: ActionTypes.getServiceMapItems,
		payload: serviceMapDependenciesResponse.data,
		payload: dependencyGraphResponse.data,
	});

	dispatch<ServiceMapLoading>({
@@ -1,8 +1,4 @@
import {
	ServiceMapItemAction,
	ServiceMapLoading,
	ServicesAction,
} from './serviceMap';
import { ServiceMapItemAction, ServiceMapLoading } from './serviceMap';
import { GetUsageDataAction } from './usage';

export enum ActionTypes {
@@ -17,6 +13,5 @@ export enum ActionTypes {

export type Action =
	| GetUsageDataAction
	| ServicesAction
	| ServiceMapItemAction
	| ServiceMapLoading;
@@ -18,4 +18,8 @@ const store = createStore(
	),
);

if (window !== undefined) {
	window.store = store;
}

export default store;
@@ -10,7 +10,9 @@ const intitalState: GlobalReducer = {
	maxTime: Date.now() * 1000000,
	minTime: (Date.now() - 15 * 60 * 1000) * 1000000,
	loading: true,
	selectedTime: getDefaultOption(window.location.pathname),
	selectedTime: getDefaultOption(
		typeof window !== 'undefined' ? window?.location?.pathname : '',
	),
};

const globalTimeReducer = (
@@ -21,7 +21,7 @@ const InitialValue: InitialValueTypes = {
	services: [],
	dbOverView: [],
	externalService: [],
	topEndPoints: [],
	topOperations: [],
	externalAverageDuration: [],
	externalError: [],
	serviceOverview: [],
@@ -29,6 +29,7 @@ const InitialValue: InitialValueTypes = {
	resourceAttributePromQLQuery: resourceAttributesQueryToPromQL(
		GetResourceAttributeQueriesFromURL() || [],
	),
	topLevelOperations: [],
};

const metrics = (
@@ -88,22 +89,24 @@ const metrics = (
		case GET_INTIAL_APPLICATION_DATA: {
			const {
				// dbOverView,
				topEndPoints,
				topOperations,
				serviceOverview,
				// externalService,
				// externalAverageDuration,
				// externalError,
				topLevelOperations,
			} = action.payload;

			return {
				...state,
				// dbOverView,
				topEndPoints,
				topOperations,
				serviceOverview,
				// externalService,
				// externalAverageDuration,
				// externalError,
				metricsApplicationLoading: false,
				topLevelOperations,
			};
		}
@@ -2,7 +2,6 @@ import { Action, ActionTypes, ServiceMapStore } from 'store/actions';

const initialState: ServiceMapStore = {
	items: [],
	services: [],
	loading: true,
};

@@ -16,11 +15,6 @@ export const ServiceMapReducer = (
				...state,
				items: action.payload,
			};
		case ActionTypes.getServices:
			return {
				...state,
				services: action.payload,
			};
		case ActionTypes.serviceMapLoading: {
			return {
				...state,
@@ -5,7 +5,7 @@
import { IResourceAttributeQuery } from 'container/MetricsApplication/ResourceAttributesFilter/types';
import { ServicesList } from 'types/api/metrics/getService';
import { ServiceOverview } from 'types/api/metrics/getServiceOverview';
import { TopEndPoints } from 'types/api/metrics/getTopEndPoints';
import { TopOperations } from 'types/api/metrics/getTopOperations';

export const GET_SERVICE_LIST_SUCCESS = 'GET_SERVICE_LIST_SUCCESS';
export const GET_SERVICE_LIST_LOADING_START = 'GET_SERVICE_LIST_LOADING_START';
@@ -38,12 +38,13 @@ export interface GetServiceListError {
export interface GetInitialApplicationData {
	type: typeof GET_INTIAL_APPLICATION_DATA;
	payload: {
		topEndPoints: TopEndPoints[];
		topOperations: TopOperations[];
		// dbOverView: DBOverView[];
		// externalService: ExternalService[];
		// externalAverageDuration: ExternalAverageDuration[];
		// externalError: ExternalError[];
		serviceOverview: ServiceOverview[];
		topLevelOperations: string[];
	};
}
@@ -18,6 +18,8 @@ export interface AlertDef {
	annotations?: Labels;
	evalWindow?: string;
	source?: string;
	disabled?: boolean;
	preferredChannels?: string[];
}

export interface RuleCondition {
@@ -1,7 +1,7 @@
import { Alerts } from './getAll';
import { AlertDef } from './def';

export interface Props {
	id: Alerts['id'];
	id: AlertDef['id'];
}

export interface PayloadProps {
@@ -4,6 +4,13 @@ export interface Props {
	id: AlertDef['id'];
}

export interface GettableAlert extends AlertDef {
	id: number;
	alert: string;
	state: string;
	disabled: boolean;
}

export type PayloadProps = {
	data: AlertDef;
	data: GettableAlert;
};
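For readers tracking this alert-type refactor: the read endpoints now return a `GettableAlert`, which is an `AlertDef` plus server-assigned metadata. A minimal sketch of how the two shapes compose — `AlertDef` is reduced here to the fields visible in this diff, so treat it as an illustration, not the full SigNoz interface:

```typescript
// Sketch of the alert types introduced in this diff (fields trimmed
// to what the diff shows; the real AlertDef has more members).
interface AlertDef {
	id?: number;
	annotations?: Record<string, string>;
	evalWindow?: string;
	source?: string;
	disabled?: boolean;
	preferredChannels?: string[];
}

// GettableAlert narrows the optional fields the server always fills in
// and adds read-only metadata such as the evaluation state.
interface GettableAlert extends AlertDef {
	id: number;
	alert: string;
	state: string;
	disabled: boolean;
}

const firingAlert: GettableAlert = {
	id: 1,
	alert: 'High error rate',
	state: 'firing',
	disabled: false,
	evalWindow: '5m0s',
};

console.log(firingAlert.state); // 'firing'
```

Narrowing `id?: number` to `id: number` in the subtype is legal in TypeScript because the derived property type is more specific, which is what lets one definition serve both "writable" and "gettable" views of an alert.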
@@ -1,32 +1,3 @@
export interface Alerts {
	labels: AlertsLabel;
	annotations: {
		description: string;
		summary: string;
		[key: string]: string;
	};
	state: string;
	name: string;
	id: number;
	endsAt: string;
	fingerprint: string;
	generatorURL: string;
	receivers: Receivers[];
	startsAt: string;
	status: {
		inhibitedBy: [];
		silencedBy: [];
		state: string;
	};
	updatedAt: string;
}
import { GettableAlert } from './get';

interface Receivers {
	name: string;
}

interface AlertsLabel {
	[key: string]: string;
}

export type PayloadProps = Alerts[];
export type PayloadProps = GettableAlert[];
@@ -1,4 +1,4 @@
import { Alerts } from './getAll';
import { AlertDef } from './def';

export interface Props {
	silenced: boolean;
@@ -7,8 +7,8 @@ export interface Props {
	[key: string]: string | boolean;
}
export interface Group {
	alerts: Alerts[];
	label: Alerts['labels'];
	alerts: AlertDef[];
	label: AlertDef['labels'];
	receiver: {
		[key: string]: string;
	};
@@ -1,4 +1,33 @@
import { Alerts } from './getAll';
export interface Alerts {
	labels: AlertsLabel;
	annotations: {
		description: string;
		summary: string;
		[key: string]: string;
	};
	state: string;
	name: string;
	id: number;
	endsAt: string;
	fingerprint: string;
	generatorURL: string;
	receivers: Receivers[];
	startsAt: string;
	status: {
		inhibitedBy: [];
		silencedBy: [];
		state: string;
	};
	updatedAt: string;
}

interface Receivers {
	name: string;
}

interface AlertsLabel {
	[key: string]: string;
}

export interface Props {
	silenced: boolean;
frontend/src/types/api/alerts/patch.ts (new file, +12)
@@ -0,0 +1,12 @@
import { GettableAlert } from './get';

export type PayloadProps = GettableAlert;

export interface PatchProps {
	disabled?: boolean;
}

export interface Props {
	id?: number;
	data: PatchProps;
}
frontend/src/types/api/alerts/testAlert.ts (new file, +10)
@@ -0,0 +1,10 @@
import { AlertDef } from 'types/api/alerts/def';

export interface Props {
	data: AlertDef;
}

export interface PayloadProps {
	alertCount: number;
	message: string;
}
frontend/src/types/api/metrics/getTopLevelOperations.ts (new file, +7)
@@ -0,0 +1,7 @@
export type TopLevelOperations = string[];

export interface Props {
	service: string;
}

export type PayloadProps = TopLevelOperations;
@@ -1,6 +1,6 @@
import { Tags } from 'types/reducer/trace';

export interface TopEndPoints {
export interface TopOperations {
	name: string;
	numCalls: number;
	p50: number;
@@ -15,4 +15,4 @@ export interface Props {
	selectedTags: Tags[];
}

export type PayloadProps = TopEndPoints[];
export type PayloadProps = TopOperations[];
@@ -5,7 +5,7 @@ import { ExternalError } from 'types/api/metrics/getExternalError';
import { ExternalService } from 'types/api/metrics/getExternalService';
import { ServicesList } from 'types/api/metrics/getService';
import { ServiceOverview } from 'types/api/metrics/getServiceOverview';
import { TopEndPoints } from 'types/api/metrics/getTopEndPoints';
import { TopOperations } from 'types/api/metrics/getTopOperations';

interface MetricReducer {
	services: ServicesList[];
@@ -15,12 +15,13 @@ interface MetricReducer {
	errorMessage: string;
	dbOverView: DBOverView[];
	externalService: ExternalService[];
	topEndPoints: TopEndPoints[];
	topOperations: TopOperations[];
	externalAverageDuration: ExternalAverageDuration[];
	externalError: ExternalError[];
	serviceOverview: ServiceOverview[];
	resourceAttributeQueries: IResourceAttributeQuery[];
	resourceAttributePromQLQuery: string;
	topLevelOperations: string[];
}

export default MetricReducer;
frontend/tests/expectionDetails/index.spec.ts (new file, +101)
@@ -0,0 +1,101 @@
import { expect, Page, test } from '@playwright/test';
import ROUTES from 'constants/routes';

import allErrorList from '../fixtures/api/allErrors/200.json';
import errorDetailSuccess from '../fixtures/api/errorDetails/200.json';
import errorDetailNotFound from '../fixtures/api/errorDetails/404.json';
import nextPreviousSuccess from '../fixtures/api/getNextPrev/200.json';
import { loginApi } from '../fixtures/common';
import { JsonApplicationType } from '../fixtures/constant';

let page: Page;
const timestamp = '1657794588955274000';

test.describe('Expections Details', async () => {
	test.beforeEach(async ({ baseURL, browser }) => {
		const context = await browser.newContext({ storageState: 'tests/auth.json' });
		const newPage = await context.newPage();

		await loginApi(newPage);

		await newPage.goto(`${baseURL}${ROUTES.APPLICATION}`);

		page = newPage;
	});

	test('Should have not found when api return 404', async () => {
		await Promise.all([
			page.route('**/errorFromGroupID**', (route) =>
				route.fulfill({
					status: 404,
					contentType: JsonApplicationType,
					body: JSON.stringify(errorDetailNotFound),
				}),
			),
			page.route('**/nextPrevErrorIDs**', (route) =>
				route.fulfill({
					status: 404,
					contentType: JsonApplicationType,
					body: JSON.stringify([]),
				}),
			),
		]);

		await page.goto(
			`${ROUTES.ERROR_DETAIL}?groupId=${allErrorList[0].groupID}&timestamp=${timestamp}`,
			{
				waitUntil: 'networkidle',
			},
		);

		const NoDataLocator = page.locator('text=Not Found');
		const isVisible = await NoDataLocator.isVisible();
		const text = await NoDataLocator.textContent();

		expect(isVisible).toBe(true);
		expect(text).toBe('Not Found');
		expect(await page.screenshot()).toMatchSnapshot();
	});

	test('Render Success Data when 200 from details page', async () => {
		await Promise.all([
			page.route('**/errorFromGroupID**', (route) =>
				route.fulfill({
					status: 200,
					contentType: JsonApplicationType,
					body: JSON.stringify(errorDetailSuccess),
				}),
			),
			page.route('**/nextPrevErrorIDs**', (route) =>
				route.fulfill({
					status: 200,
					contentType: JsonApplicationType,
					body: JSON.stringify(nextPreviousSuccess),
				}),
			),
		]);

		await page.goto(
			`${ROUTES.ERROR_DETAIL}?groupId=${allErrorList[0].groupID}&timestamp=${timestamp}`,
			{
				waitUntil: 'networkidle',
			},
		);

		const traceDetailButton = page.locator('text=See the error in trace graph');
		const olderButton = page.locator('text=Older');
		const newerButton = page.locator(`text=Newer`);

		expect(await traceDetailButton.isVisible()).toBe(true);
		expect(await olderButton.isVisible()).toBe(true);
		expect(await newerButton.isVisible()).toBe(true);

		expect(await traceDetailButton.textContent()).toBe(
			'See the error in trace graph',
		);
		expect(await olderButton.textContent()).toBe('Older');
		expect(await newerButton.textContent()).toBe('Newer');

		expect(await page.screenshot()).toMatchSnapshot();
	});
});
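The tests in this file build the same `route.fulfill` options object (status, content type, JSON-stringified body) over and over. A small builder keeps those fixtures declarative — this helper and its name are my own suggestion, not part of the SigNoz codebase:

```typescript
// Hypothetical helper distilling the route-mocking pattern used in the
// Playwright specs above. It only builds the plain options object, so it
// has no Playwright dependency of its own.
const JsonApplicationType = 'application/json';

interface FulfillOptions {
	status: number;
	contentType: string;
	body: string;
}

function jsonResponse(status: number, payload: unknown): FulfillOptions {
	return {
		status,
		contentType: JsonApplicationType,
		body: JSON.stringify(payload),
	};
}

// Inside a test it would be used as:
//   page.route('**/errorFromGroupID**', (route) =>
//     route.fulfill(jsonResponse(404, errorDetailNotFound)),
//   );
console.log(jsonResponse(404, { status: 'error' }).status); // 404
```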
(updated snapshot images: 183 KiB, 39 KiB)
frontend/tests/expections/index.spec.ts (new file, +148)
@@ -0,0 +1,148 @@
import { expect, Page, test } from '@playwright/test';
import ROUTES from 'constants/routes';

import successAllErrors from '../fixtures/api/allErrors/200.json';
import { loginApi } from '../fixtures/common';
import { JsonApplicationType } from '../fixtures/constant';

const noDataTableData = async (page: Page): Promise<void> => {
	const text = page.locator('text=No Data');

	expect(text).toBeVisible();
	expect(text).toHaveText('No Data');

	const textType = [
		'Exception Type',
		'Error Message',
		'Last Seen',
		'First Seen',
		'Application',
	];

	textType.forEach(async (text) => {
		const textLocator = page.locator(`text=${text}`);

		const textContent = await textLocator.textContent();

		expect(textContent).toBe(text);
		expect(textLocator).not.toBeNull();

		expect(textLocator).toBeVisible();
		await expect(textLocator).toHaveText(`${text}`);
	});
};

let page: Page;

test.describe('Expections page', async () => {
	test.beforeEach(async ({ baseURL, browser }) => {
		const context = await browser.newContext({ storageState: 'tests/auth.json' });
		const newPage = await context.newPage();

		await loginApi(newPage);

		await newPage.goto(`${baseURL}${ROUTES.APPLICATION}`);

		page = newPage;
	});

	test('Should have a valid route', async () => {
		await page.goto(ROUTES.ALL_ERROR);

		await expect(page).toHaveURL(ROUTES.ALL_ERROR);
		expect(await page.screenshot()).toMatchSnapshot();
	});

	test('Should have a valid Breadcrumbs', async () => {
		await page.goto(ROUTES.ALL_ERROR, {
			waitUntil: 'networkidle',
		});

		const expectionsLocator = page.locator('a:has-text("Exceptions")');

		await expect(expectionsLocator).toBeVisible();
		await expect(expectionsLocator).toHaveText('Exceptions');
		await expect(expectionsLocator).toHaveAttribute('href', ROUTES.ALL_ERROR);
		expect(await page.screenshot()).toMatchSnapshot();
	});

	test('Should render the page with 404 status', async () => {
		await page.route('**listErrors', (route) =>
			route.fulfill({
				status: 404,
				contentType: JsonApplicationType,
				body: JSON.stringify([]),
			}),
		);

		await page.goto(ROUTES.ALL_ERROR, {
			waitUntil: 'networkidle',
		});

		await noDataTableData(page);
		expect(await page.screenshot()).toMatchSnapshot();
	});

	test('Should render the page with 500 status in antd notification with no data antd table', async () => {
		await page.route(`**/listErrors**`, (route) =>
			route.fulfill({
				status: 500,
				contentType: JsonApplicationType,
				body: JSON.stringify([]),
			}),
		);

		await page.goto(ROUTES.ALL_ERROR, {
			waitUntil: 'networkidle',
		});

		const text = 'Something went wrong';

		const el = page.locator(`text=${text}`);

		expect(el).toBeVisible();
		expect(el).toHaveText(`${text}`);
		expect(await el.getAttribute('disabled')).toBe(null);

		await noDataTableData(page);
		expect(await page.screenshot()).toMatchSnapshot();
	});

	test('Should render data in antd table', async () => {
		await Promise.all([
			page.route(`**/listErrors**`, (route) =>
				route.fulfill({
					status: 200,
					contentType: JsonApplicationType,
					body: JSON.stringify(successAllErrors),
				}),
			),

			page.route('**/countErrors**', (route) =>
				route.fulfill({
					status: 200,
					contentType: JsonApplicationType,
					body: JSON.stringify(200),
				}),
			),
		]);

		await page.goto(ROUTES.ALL_ERROR, {
			waitUntil: 'networkidle',
		});

		await page.evaluate(() => window.scrollTo(0, document.body.scrollHeight));

		const expectionType = page.locator(
			`td:has-text("${successAllErrors[1].exceptionType}")`,
		);

		expect(expectionType).toBeVisible();

		const second = page.locator('li > a:has-text("2") >> nth=0');
		const isVisisble = await second.isVisible();

		expect(isVisisble).toBe(true);
		expect(await page.screenshot()).toMatchSnapshot();
	});
});
(updated snapshot images: 60 KiB, 42 KiB, 115 KiB, 60 KiB, 58 KiB)
frontend/tests/fixtures/api/allErrors/200.json (new file, +92)
@@ -0,0 +1,92 @@
[
  {
    "exceptionType": "ConnectionError",
    "exceptionMessage": "HTTPSConnectionPool(host='run.mocekdy.io', port=443): Max retries exceeded with url: /v3/1cwb67153-a6ac-4aae-aca6-273ed68b5d9e (Caused by NewConnectionError('\u003curllib3.connection.HTTPSConnection object at 0x108ce9c10\u003e: Failed to establish a new connection: [Errno 8] nodename nor servname provided, or not known'))",
    "exceptionCount": 2,
    "lastSeen": "2022-07-14T10:29:48.955274Z",
    "firstSeen": "2022-07-14T10:29:48.950721Z",
    "serviceName": "1rfflaskAp",
    "groupID": "e24d35bda98c5499a5c8df3ba61b0238"
  },
  {
    "exceptionType": "NameError",
    "exceptionMessage": "name 'listf' is not defined",
    "exceptionCount": 8,
    "lastSeen": "2022-07-14T10:30:42.411035Z",
    "firstSeen": "2022-07-14T10:29:45.426784Z",
    "serviceName": "1rfflaskAp",
    "groupID": "efc46adcd5e87b65f8f244cba683b265"
  },
  {
    "exceptionType": "ZeroDivisionError",
    "exceptionMessage": "division by zero",
    "exceptionCount": 1,
    "lastSeen": "2022-07-14T10:29:54.195996Z",
    "firstSeen": "2022-07-14T10:29:54.195996Z",
    "serviceName": "1rfflaskAp",
    "groupID": "a49058b540eef9aefe159d84f1a2b6df"
  },
  {
    "exceptionType": "MaxRetryError",
    "exceptionMessage": "HTTPSConnectionPool(host='rufn.fmoceky.io', port=443): Max retries exceeded with url: /v3/b851a5c6-ab54-495a-be04-69834ae0d2a7 (Caused by NewConnectionError('\u003curllib3.connection.HTTPSConnection object at 0x108ec2640\u003e: Failed to establish a new connection: [Errno 8] nodename nor servname provided, or not known'))",
    "exceptionCount": 1,
    "lastSeen": "2022-07-14T10:29:49.471402Z",
    "firstSeen": "2022-07-14T10:29:49.471402Z",
    "serviceName": "1rfflaskAp",
    "groupID": "e59d39239f4d48842d83e3cc4cf53249"
  },
  {
    "exceptionType": "MaxRetryError",
    "exceptionMessage": "HTTPSConnectionPool(host='run.mocekdy.io', port=443): Max retries exceeded with url: /v3/1cwb67153-a6ac-4aae-aca6-273ed68b5d9e (Caused by NewConnectionError('\u003curllib3.connection.HTTPSConnection object at 0x108ce9c10\u003e: Failed to establish a new connection: [Errno 8] nodename nor servname provided, or not known'))",
    "exceptionCount": 1,
    "lastSeen": "2022-07-14T10:29:48.947579Z",
    "firstSeen": "2022-07-14T10:29:48.947579Z",
    "serviceName": "1rfflaskAp",
    "groupID": "14d18a6fb1cd3f541de1566530e75486"
  },
  {
    "exceptionType": "ConnectionError",
    "exceptionMessage": "HTTPSConnectionPool(host='rufn.fmoceky.io', port=443): Max retries exceeded with url: /v3/b851a5c6-ab54-495a-be04-69834ae0d2a7 (Caused by NewConnectionError('\u003curllib3.connection.HTTPSConnection object at 0x108ec2640\u003e: Failed to establish a new connection: [Errno 8] nodename nor servname provided, or not known'))",
    "exceptionCount": 2,
    "lastSeen": "2022-07-14T10:29:49.476718Z",
    "firstSeen": "2022-07-14T10:29:49.472271Z",
    "serviceName": "1rfflaskAp",
    "groupID": "bf6d88d10397ca3194b96a10f4719031"
  },
  {
    "exceptionType": "github.com/gin-gonic/gin.Error",
    "exceptionMessage": "Sample Error",
    "exceptionCount": 6,
    "lastSeen": "2022-07-15T18:55:32.3538096Z",
    "firstSeen": "2022-07-14T14:47:19.874387Z",
    "serviceName": "goApp",
    "groupID": "b4fd099280072d45318e1523d82aa9c1"
  },
  {
    "exceptionType": "MaxRetryError",
    "exceptionMessage": "HTTPSConnectionPool(host='rufn.fmoceky.io', port=443): Max retries exceeded with url: /v3/b851a5c6-ab54-495a-be04-69834ae0d2a7 (Caused by NewConnectionError('\u003curllib3.connection.HTTPSConnection object at 0x10801b490\u003e: Failed to establish a new connection: [Errno 8] nodename nor servname provided, or not known'))",
    "exceptionCount": 1,
    "lastSeen": "2022-07-14T11:07:06.560593Z",
    "firstSeen": "2022-07-14T11:07:06.560593Z",
    "serviceName": "samplFlaskApp",
    "groupID": "1945671c945b10641e73b0fe28c4d486"
  },
  {
    "exceptionType": "ConnectionError",
    "exceptionMessage": "HTTPSConnectionPool(host='rufn.fmoceky.io', port=443): Max retries exceeded with url: /v3/b851a5c6-ab54-495a-be04-69834ae0d2a7 (Caused by NewConnectionError('\u003curllib3.connection.HTTPSConnection object at 0x10801b490\u003e: Failed to establish a new connection: [Errno 8] nodename nor servname provided, or not known'))",
    "exceptionCount": 2,
    "lastSeen": "2022-07-14T11:07:06.56493Z",
    "firstSeen": "2022-07-14T11:07:06.561074Z",
    "serviceName": "samplFlaskApp",
    "groupID": "5bea5295cac187404005f9c96e71aa53"
  },
  {
    "exceptionType": "ConnectionError",
    "exceptionMessage": "HTTPSConnectionPool(host='rufn.fmoceky.io', port=443): Max retries exceeded with url: /v3/b851a5c6-ab54-495a-be04-69834ae0d2a7 (Caused by NewConnectionError('\u003curllib3.connection.HTTPSConnection object at 0x108031820\u003e: Failed to establish a new connection: [Errno 8] nodename nor servname provided, or not known'))",
    "exceptionCount": 2,
    "lastSeen": "2022-07-14T11:07:06.363977Z",
    "firstSeen": "2022-07-14T11:07:06.361163Z",
    "serviceName": "samplFlaskApp",
    "groupID": "52a1fbe033453d806c0f24ba39168a78"
  }
]
frontend/tests/fixtures/api/errorDetails/200.json (new file, +12; contents collapsed in the diff view)
frontend/tests/fixtures/api/errorDetails/404.json (new file, +5)
@@ -0,0 +1,5 @@
{
  "error": "Error/Exception not found",
  "errorType": "not_found",
  "status": "error"
}
frontend/tests/fixtures/api/getNextPrev/200.json (new file, +7)
@@ -0,0 +1,7 @@
{
  "nextErrorID": "",
  "nextTimestamp": "0001-01-01T00:00:00Z",
  "prevErrorID": "217133e5f7df429abd31b507859ea513",
  "prevTimestamp": "2022-07-14T10:29:48.950721Z",
  "groupID": "e24d35bda98c5499a5c8df3ba61b0238"
}
frontend/tests/fixtures/constant.ts (+2)
@@ -6,3 +6,5 @@ export const validPassword = 'SamplePassword98@@';

export const getStartedButtonSelector = 'button[data-attr="signup"]';
export const confirmPasswordSelector = '#password-confirm-error';

export const JsonApplicationType = 'application/json';
@@ -24,5 +24,6 @@ test.describe('Version API fail while loading login page', async () => {
		expect(el).toBeVisible();
		expect(el).toHaveText(`${text}`);
		expect(await el.getAttribute('disabled')).toBe(null);
		expect(await page.screenshot()).toMatchSnapshot();
	});
});
(updated snapshot image: 7.9 KiB)
@@ -45,5 +45,6 @@ test.describe('Login Page', () => {
		element.isVisible();
		const text = await element.innerText();
		expect(text).toBe(`SigNoz ${version}`);
		expect(await page.screenshot()).toMatchSnapshot();
	});
});
(updated snapshot image: 46 KiB)
@@ -16,7 +16,17 @@ test.describe('Service Page', () => {

		page = newPage;
	});

	test('Serice Page is rendered', async ({ baseURL }) => {
		await expect(page).toHaveURL(`${baseURL}${ROUTES.APPLICATION}`);
		expect(await page.screenshot()).toMatchSnapshot();
	});

	test('Logged In must be true', async () => {
		const { app } = await page.evaluate(() => window.store.getState());

		const { isLoggedIn } = app;

		expect(isLoggedIn).toBe(true);
	});
});
(updated snapshot image: 40 KiB)
@@ -77,6 +77,7 @@ test.describe('Sign Up Page', () => {
		await buttonSignupButton.click();

		expect(page).toHaveURL(`${baseURL}${ROUTES.SIGN_UP}`);
		expect(await page.screenshot()).toMatchSnapshot();
	});

	test('Invite link validation', async ({ baseURL, page }) => {
@@ -87,6 +88,7 @@ test.describe('Sign Up Page', () => {
		const messageText = await page.locator(`text=${message}`).innerText();

		expect(messageText).toBe(message);
		expect(await page.screenshot()).toMatchSnapshot();
	});

	test('User Sign up with valid details', async ({ baseURL, page, context }) => {
@@ -125,6 +127,7 @@ test.describe('Sign Up Page', () => {
		await context.storageState({
			path: 'tests/auth.json',
		});
		expect(await page.screenshot()).toMatchSnapshot();
	});

	test('Empty name with valid details', async ({ baseURL, page }) => {
@@ -142,6 +145,7 @@ test.describe('Sign Up Page', () => {
		const gettingStartedButton = page.locator(getStartedButtonSelector);

		expect(await gettingStartedButton.isDisabled()).toBe(true);
		expect(await page.screenshot()).toMatchSnapshot();
	});

	test('Empty Company name with valid details', async ({ baseURL, page }) => {
@@ -159,6 +163,7 @@ test.describe('Sign Up Page', () => {
		const gettingStartedButton = page.locator(getStartedButtonSelector);

		expect(await gettingStartedButton.isDisabled()).toBe(true);
		expect(await page.screenshot()).toMatchSnapshot();
	});

	test('Empty Email with valid details', async ({ baseURL, page }) => {
@@ -176,6 +181,7 @@ test.describe('Sign Up Page', () => {
		const gettingStartedButton = page.locator(getStartedButtonSelector);

		expect(await gettingStartedButton.isDisabled()).toBe(true);
		expect(await page.screenshot()).toMatchSnapshot();
	});

	test('Empty Password and confirm password with valid details', async ({
@@ -200,6 +206,7 @@ test.describe('Sign Up Page', () => {
		// password validation message is not present
		const locator = await page.locator(confirmPasswordSelector).isVisible();
		expect(locator).toBe(false);
		expect(await page.screenshot()).toMatchSnapshot();
	});

	test('Miss Match Password and confirm password with valid details', async ({
@@ -220,5 +227,6 @@ test.describe('Sign Up Page', () => {
		// password validation message is not present
		const locator = await page.locator(confirmPasswordSelector).isVisible();
		expect(locator).toBe(true);
		expect(await page.screenshot()).toMatchSnapshot();
	});
});
(updated snapshot images: 70 KiB, 70 KiB, 69 KiB, 70 KiB, 66 KiB, 72 KiB, 54 KiB, 70 KiB)
@@ -20,7 +20,7 @@ RUN go mod download -x

# Add the sources and proceed with build
ADD . .
RUN go build -a -ldflags "-linkmode external -extldflags '-static' -s -w $LD_FLAGS" -o ./bin/query-service ./main.go
RUN go build -tags timetzdata -a -ldflags "-linkmode external -extldflags '-static' -s -w $LD_FLAGS" -o ./bin/query-service ./main.go
RUN chmod +x ./bin/query-service
@@ -6,8 +6,37 @@ Query service is the interface between frontend and databases. It is written in
- parse response from databases and handle error if any
- clickhouse response in the format accepted by Frontend

# Complete the clickhouse setup locally.
https://github.com/SigNoz/signoz/blob/main/CONTRIBUTING.md#to-run-clickhouse-setup-recommended-for-local-development

- Comment out the query-service and the frontend section in `signoz/deploy/docker/clickhouse-setup/docker-compose.yaml`
- Change the alertmanager section in `signoz/deploy/docker/clickhouse-setup/docker-compose.yaml` as follows:
```console
  alertmanager:
    image: signoz/alertmanager:0.23.0-0.1
    volumes:
      - ./data/alertmanager:/data
    expose:
      - "9093"
    ports:
      - "8080:9093"
    # depends_on:
    #   query-service:
    #     condition: service_healthy
    restart: on-failure
    command:
      - --queryService.url=http://172.17.0.1:8085
      - --storage.path=/data
```
- Run the following:
```console
cd signoz/
If you are using x86_64 processors (All Intel/AMD processors) run sudo make run-x86
If you are on arm64 processors (Apple M1 Macs) run sudo make run-arm
```

#### Backend Configuration

#### Configuration
- Open ./constants/constants.go
- Replace ```const RELATIONAL_DATASOURCE_PATH = "/var/lib/signoz/signoz.db"``` \
  with ```const RELATIONAL_DATASOURCE_PATH = "./signoz.db"```.
@@ -15,8 +44,9 @@ Query service is the interface between frontend and databases. It is written in
- Query Service needs below `env` variables to run:

```
ClickHouseUrl=tcp://localhost:9001
STORAGE=clickhouse
export ClickHouseUrl=tcp://localhost:9001
export STORAGE=clickhouse
export ALERTMANAGER_API_PREFIX=http://localhost:9093/api/
```

<!-- The above values are the default ones used by SigNoz and are kept at `deploy/kubernetes/platform/signoz-charts/query-service/values.yaml` -->
@@ -28,5 +58,24 @@ go build -o build/query-service main.go
ClickHouseUrl=tcp://localhost:9001 STORAGE=clickhouse build/query-service
```

# Frontend Configuration for local query-service.

- Set the following environment variables
```console
export FRONTEND_API_ENDPOINT=http://localhost:8080
```

- Run the following
```console
cd signoz\frontend\
yarn install
yarn dev
```

## Note:
If you use go version 1.18 for development and contributions, then please checkout the following issue.
https://github.com/SigNoz/signoz/issues/1371


#### Docker Images
The docker images of query-service is available at https://hub.docker.com/r/signoz/query-service
@@ -18,16 +18,19 @@ const (
)

const (
    defaultDatasource              string        = "tcp://localhost:9000"
    defaultTraceDB                 string        = "signoz_traces"
    defaultOperationsTable         string        = "signoz_operations"
    defaultIndexTable              string        = "signoz_index_v2"
    defaultErrorTable              string        = "signoz_error_index_v2"
    defaulDurationTable            string        = "durationSortMV"
    defaultSpansTable              string        = "signoz_spans"
    defaultWriteBatchDelay         time.Duration = 5 * time.Second
    defaultWriteBatchSize          int           = 10000
    defaultEncoding                Encoding      = EncodingJSON
    defaultDatasource              string        = "tcp://localhost:9000"
    defaultTraceDB                 string        = "signoz_traces"
    defaultOperationsTable         string        = "signoz_operations"
    defaultIndexTable              string        = "signoz_index_v2"
    defaultErrorTable              string        = "signoz_error_index_v2"
    defaultDurationTable           string        = "durationSortMV"
    defaultUsageExplorerTable      string        = "usage_explorer"
    defaultSpansTable              string        = "signoz_spans"
    defaultDependencyGraphTable    string        = "dependency_graph_minutes"
    defaultTopLevelOperationsTable string        = "top_level_operations"
    defaultWriteBatchDelay         time.Duration = 5 * time.Second
    defaultWriteBatchSize          int           = 10000
    defaultEncoding                Encoding      = EncodingJSON
)

const (
@@ -43,19 +46,22 @@ const (

// NamespaceConfig is Clickhouse's internal configuration data
type namespaceConfig struct {
    namespace               string
    Enabled                 bool
    Datasource              string
    TraceDB                 string
    OperationsTable         string
    IndexTable              string
    DurationTable           string
    SpansTable              string
    ErrorTable              string
    WriteBatchDelay         time.Duration
    WriteBatchSize          int
    Encoding                Encoding
    Connector               Connector
    namespace               string
    Enabled                 bool
    Datasource              string
    TraceDB                 string
    OperationsTable         string
    IndexTable              string
    DurationTable           string
    UsageExplorerTable      string
    SpansTable              string
    ErrorTable              string
    DependencyGraphTable    string
    TopLevelOperationsTable string
    WriteBatchDelay         time.Duration
    WriteBatchSize          int
    Encoding                Encoding
    Connector               Connector
}

// Connecto defines how to connect to the database
@@ -102,19 +108,22 @@ func NewOptions(datasource string, primaryNamespace string, otherNamespaces ...s

options := &Options{
    primary: &namespaceConfig{
        namespace:       primaryNamespace,
        Enabled:         true,
        Datasource:      datasource,
        TraceDB:         defaultTraceDB,
        OperationsTable: defaultOperationsTable,
        IndexTable:      defaultIndexTable,
        ErrorTable:      defaultErrorTable,
        DurationTable:   defaulDurationTable,
        SpansTable:      defaultSpansTable,
        WriteBatchDelay: defaultWriteBatchDelay,
        WriteBatchSize:  defaultWriteBatchSize,
        Encoding:        defaultEncoding,
        Connector:       defaultConnector,
        namespace:               primaryNamespace,
        Enabled:                 true,
        Datasource:              datasource,
        TraceDB:                 defaultTraceDB,
        OperationsTable:         defaultOperationsTable,
        IndexTable:              defaultIndexTable,
        ErrorTable:              defaultErrorTable,
        DurationTable:           defaultDurationTable,
        UsageExplorerTable:      defaultUsageExplorerTable,
        SpansTable:              defaultSpansTable,
        DependencyGraphTable:    defaultDependencyGraphTable,
        TopLevelOperationsTable: defaultTopLevelOperationsTable,
        WriteBatchDelay:         defaultWriteBatchDelay,
        WriteBatchSize:          defaultWriteBatchSize,
        Encoding:                defaultEncoding,
        Connector:               defaultConnector,
    },
    others: make(map[string]*namespaceConfig, len(otherNamespaces)),
}
@@ -47,16 +47,17 @@ import (
)

const (
    primaryNamespace      = "clickhouse"
    archiveNamespace      = "clickhouse-archive"
    signozTraceDBName     = "signoz_traces"
    signozDurationMVTable = "durationSort"
    signozSpansTable      = "signoz_spans"
    signozErrorIndexTable = "signoz_error_index_v2"
    signozTraceTableName  = "signoz_index_v2"
    signozMetricDBName    = "signoz_metrics"
    signozSampleTableName = "samples_v2"
    signozTSTableName     = "time_series_v2"
    primaryNamespace         = "clickhouse"
    archiveNamespace         = "clickhouse-archive"
    signozTraceDBName        = "signoz_traces"
    signozDurationMVTable    = "durationSort"
    signozUsageExplorerTable = "usage_explorer"
    signozSpansTable         = "signoz_spans"
    signozErrorIndexTable    = "signoz_error_index_v2"
    signozTraceTableName     = "signoz_index_v2"
    signozMetricDBName       = "signoz_metrics"
    signozSampleTableName    = "samples_v2"
    signozTSTableName        = "time_series_v2"

    minTimespanForProgressiveSearch       = time.Hour
    minTimespanForProgressiveSearchMargin = time.Minute
@@ -75,16 +76,19 @@ var (

// SpanWriter for reading spans from ClickHouse
type ClickHouseReader struct {
    db              clickhouse.Conn
    localDB         *sqlx.DB
    traceDB         string
    operationsTable string
    durationTable   string
    indexTable      string
    errorTable      string
    spansTable      string
    queryEngine     *promql.Engine
    remoteStorage   *remote.Storage
    db                      clickhouse.Conn
    localDB                 *sqlx.DB
    traceDB                 string
    operationsTable         string
    durationTable           string
    usageExplorerTable      string
    indexTable              string
    errorTable              string
    spansTable              string
    dependencyGraphTable    string
    topLevelOperationsTable string
    queryEngine             *promql.Engine
    remoteStorage           *remote.Storage

    promConfigFile string
    promConfig     *config.Config
@@ -111,16 +115,19 @@ func NewReader(localDB *sqlx.DB, configFile string) *ClickHouseReader {
}

return &ClickHouseReader{
    db:              db,
    localDB:         localDB,
    traceDB:         options.primary.TraceDB,
    alertManager:    alertManager,
    operationsTable: options.primary.OperationsTable,
    indexTable:      options.primary.IndexTable,
    errorTable:      options.primary.ErrorTable,
    durationTable:   options.primary.DurationTable,
    spansTable:      options.primary.SpansTable,
    promConfigFile:  configFile,
    db:                      db,
    localDB:                 localDB,
    traceDB:                 options.primary.TraceDB,
    alertManager:            alertManager,
    operationsTable:         options.primary.OperationsTable,
    indexTable:              options.primary.IndexTable,
    errorTable:              options.primary.ErrorTable,
    usageExplorerTable:      options.primary.UsageExplorerTable,
    durationTable:           options.primary.DurationTable,
    spansTable:              options.primary.SpansTable,
    dependencyGraphTable:    options.primary.DependencyGraphTable,
    topLevelOperationsTable: options.primary.TopLevelOperationsTable,
    promConfigFile:          configFile,
}
}
@@ -374,14 +381,21 @@ func (r *ClickHouseReader) GetChannel(id string) (*model.ChannelItem, *model.Api
idInt, _ := strconv.Atoi(id)
channel := model.ChannelItem{}

query := fmt.Sprintf("SELECT id, created_at, updated_at, name, type, data data FROM notification_channels WHERE id=%d", idInt)
query := "SELECT id, created_at, updated_at, name, type, data data FROM notification_channels WHERE id=? "

err := r.localDB.Get(&channel, query)
stmt, err := r.localDB.Preparex(query)

zap.S().Info(query)
zap.S().Info(query, idInt)

if err != nil {
    zap.S().Debug("Error in processing sql query: ", err)
    zap.S().Debug("Error in preparing sql query for GetChannel : ", err)
    return nil, &model.ApiError{Typ: model.ErrorInternal, Err: err}
}

err = stmt.Get(&channel, idInt)

if err != nil {
    zap.S().Debug(fmt.Sprintf("Error in getting channel with id=%d : ", idInt), err)
    return nil, &model.ApiError{Typ: model.ErrorInternal, Err: err}
}
@@ -650,103 +664,153 @@ func (r *ClickHouseReader) GetServicesList(ctx context.Context) (*[]string, erro
    return &services, nil
}

func (r *ClickHouseReader) GetTopLevelOperations(ctx context.Context) (*map[string][]string, *model.ApiError) {

    operations := map[string][]string{}
    query := fmt.Sprintf(`SELECT DISTINCT name, serviceName FROM %s.%s`, r.traceDB, r.topLevelOperationsTable)

    rows, err := r.db.Query(ctx, query)

    if err != nil {
        zap.S().Error("Error in processing sql query: ", err)
        return nil, &model.ApiError{Typ: model.ErrorExec, Err: fmt.Errorf("Error in processing sql query")}
    }

    defer rows.Close()
    for rows.Next() {
        var name, serviceName string
        if err := rows.Scan(&name, &serviceName); err != nil {
            return nil, &model.ApiError{Typ: model.ErrorInternal, Err: fmt.Errorf("Error in reading data")}
        }
        if _, ok := operations[serviceName]; !ok {
            operations[serviceName] = []string{}
        }
        operations[serviceName] = append(operations[serviceName], name)
    }
    return &operations, nil
}
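The new `GetTopLevelOperations` above groups operation names by service as it scans rows. That grouping step, isolated from the ClickHouse scan, can be sketched in plain Go (the row values here are made up for illustration):

```go
package main

import "fmt"

// groupByService collects operation names per service, mirroring the
// map-building loop in GetTopLevelOperations: each (name, serviceName)
// row appends the name to its service's slice.
func groupByService(rows [][2]string) map[string][]string {
	operations := map[string][]string{}
	for _, row := range rows {
		name, serviceName := row[0], row[1]
		if _, ok := operations[serviceName]; !ok {
			operations[serviceName] = []string{}
		}
		operations[serviceName] = append(operations[serviceName], name)
	}
	return operations
}

func main() {
	rows := [][2]string{
		{"GET /orders", "order-service"},
		{"GET /users", "user-service"},
		{"POST /orders", "order-service"},
	}
	ops := groupByService(rows)
	fmt.Println(len(ops["order-service"])) // 2
	fmt.Println(ops["user-service"][0])    // GET /users
}
```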
func (r *ClickHouseReader) GetServices(ctx context.Context, queryParams *model.GetServicesParams) (*[]model.ServiceItem, *model.ApiError) {

    if r.indexTable == "" {
        return nil, &model.ApiError{Typ: model.ErrorExec, Err: ErrNoIndexTable}
    }

    topLevelOps, apiErr := r.GetTopLevelOperations(ctx)
    if apiErr != nil {
        return nil, apiErr
    }

    serviceItems := []model.ServiceItem{}
    var wg sync.WaitGroup
    // limit the number of concurrent queries to not overload the clickhouse server
    sem := make(chan struct{}, 10)
    var mtx sync.RWMutex

    query := fmt.Sprintf("SELECT serviceName, quantile(0.99)(durationNano) as p99, avg(durationNano) as avgDuration, count(*) as numCalls FROM %s.%s WHERE timestamp>='%s' AND timestamp<='%s' AND kind='2'", r.traceDB, r.indexTable, strconv.FormatInt(queryParams.Start.UnixNano(), 10), strconv.FormatInt(queryParams.End.UnixNano(), 10))
    args := []interface{}{}
    args, errStatus := buildQueryWithTagParams(ctx, queryParams.Tags, &query, args)
    if errStatus != nil {
        return nil, errStatus
    for svc, ops := range *topLevelOps {
        sem <- struct{}{}
        wg.Add(1)
        go func(svc string, ops []string) {
            defer wg.Done()
            defer func() { <-sem }()
            var serviceItem model.ServiceItem
            var numErrors uint64
            query := fmt.Sprintf(
                `SELECT
                    quantile(0.99)(durationNano) as p99,
                    avg(durationNano) as avgDuration,
                    count(*) as numCalls
                FROM %s.%s
                WHERE serviceName = @serviceName AND name In [@names] AND timestamp>= @start AND timestamp<= @end`,
                r.traceDB, r.indexTable,
            )
            errorQuery := fmt.Sprintf(
                `SELECT
                    count(*) as numErrors
                FROM %s.%s
                WHERE serviceName = @serviceName AND name In [@names] AND timestamp>= @start AND timestamp<= @end AND statusCode=2`,
                r.traceDB, r.indexTable,
            )

            args := []interface{}{}
            args = append(args,
                clickhouse.Named("start", strconv.FormatInt(queryParams.Start.UnixNano(), 10)),
                clickhouse.Named("end", strconv.FormatInt(queryParams.End.UnixNano(), 10)),
                clickhouse.Named("serviceName", svc),
                clickhouse.Named("names", ops),
            )
            args, errStatus := buildQueryWithTagParams(ctx, queryParams.Tags, &query, args)
            if errStatus != nil {
                zap.S().Error("Error in processing sql query: ", errStatus)
                return
            }
            err := r.db.QueryRow(
                ctx,
                query,
                args...,
            ).ScanStruct(&serviceItem)

            if err != nil {
                zap.S().Error("Error in processing sql query: ", err)
                return
            }

            err = r.db.QueryRow(ctx, errorQuery, args...).Scan(&numErrors)
            if err != nil {
                zap.S().Error("Error in processing sql query: ", err)
                return
            }

            serviceItem.ServiceName = svc
            serviceItem.NumErrors = numErrors
            mtx.Lock()
            serviceItems = append(serviceItems, serviceItem)
            mtx.Unlock()
        }(svc, ops)
    }
    query += " GROUP BY serviceName ORDER BY p99 DESC"
    err := r.db.Select(ctx, &serviceItems, query, args...)
    wg.Wait()

    zap.S().Info(query)

    if err != nil {
        zap.S().Debug("Error in processing sql query: ", err)
        return nil, &model.ApiError{Typ: model.ErrorExec, Err: fmt.Errorf("Error in processing sql query")}
    for idx := range serviceItems {
        serviceItems[idx].CallRate = float64(serviceItems[idx].NumCalls) / float64(queryParams.Period)
        serviceItems[idx].ErrorRate = float64(serviceItems[idx].NumErrors) * 100 / float64(serviceItems[idx].NumCalls)
    }

    ////////////////// Below block gets 5xx of services
    serviceErrorItems := []model.ServiceItem{}

    query = fmt.Sprintf("SELECT serviceName, count(*) as numErrors FROM %s.%s WHERE timestamp>='%s' AND timestamp<='%s' AND kind='2' AND (statusCode>=500 OR statusCode=2)", r.traceDB, r.indexTable, strconv.FormatInt(queryParams.Start.UnixNano(), 10), strconv.FormatInt(queryParams.End.UnixNano(), 10))
    args = []interface{}{}
    args, errStatus = buildQueryWithTagParams(ctx, queryParams.Tags, &query, args)
    if errStatus != nil {
        return nil, errStatus
    }
    query += " GROUP BY serviceName"
    err = r.db.Select(ctx, &serviceErrorItems, query, args...)

    zap.S().Info(query)

    if err != nil {
        zap.S().Debug("Error in processing sql query: ", err)
        return nil, &model.ApiError{Typ: model.ErrorExec, Err: fmt.Errorf("Error in processing sql query")}
    }

    m5xx := make(map[string]uint64)

    for j := range serviceErrorItems {
        m5xx[serviceErrorItems[j].ServiceName] = serviceErrorItems[j].NumErrors
    }
    ///////////////////////////////////////////

    ////////////////// Below block gets 4xx of services

    service4xxItems := []model.ServiceItem{}

    query = fmt.Sprintf("SELECT serviceName, count(*) as num4xx FROM %s.%s WHERE timestamp>='%s' AND timestamp<='%s' AND kind='2' AND statusCode>=400 AND statusCode<500", r.traceDB, r.indexTable, strconv.FormatInt(queryParams.Start.UnixNano(), 10), strconv.FormatInt(queryParams.End.UnixNano(), 10))
    args = []interface{}{}
    args, errStatus = buildQueryWithTagParams(ctx, queryParams.Tags, &query, args)
    if errStatus != nil {
        return nil, errStatus
    }
    query += " GROUP BY serviceName"
    err = r.db.Select(ctx, &service4xxItems, query, args...)

    zap.S().Info(query)

    if err != nil {
        zap.S().Debug("Error in processing sql query: ", err)
        return nil, &model.ApiError{Typ: model.ErrorExec, Err: fmt.Errorf("Error in processing sql query")}
    }

    m4xx := make(map[string]uint64)

    for j := range service4xxItems {
        m4xx[service4xxItems[j].ServiceName] = service4xxItems[j].Num4XX
    }

    for i := range serviceItems {
        if val, ok := m5xx[serviceItems[i].ServiceName]; ok {
            serviceItems[i].NumErrors = val
        }
        if val, ok := m4xx[serviceItems[i].ServiceName]; ok {
            serviceItems[i].Num4XX = val
        }
        serviceItems[i].CallRate = float64(serviceItems[i].NumCalls) / float64(queryParams.Period)
        serviceItems[i].FourXXRate = float64(serviceItems[i].Num4XX) * 100 / float64(serviceItems[i].NumCalls)
        serviceItems[i].ErrorRate = float64(serviceItems[i].NumErrors) * 100 / float64(serviceItems[i].NumCalls)
    }

    return &serviceItems, nil
}
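The rewritten `GetServices` fans out one pair of queries per service, bounded by a 10-slot semaphore channel so the ClickHouse server is not overloaded, and collects results under a mutex. A self-contained sketch of that pattern, with the per-service ClickHouse call replaced by a stand-in function:

```go
package main

import (
	"fmt"
	"sync"
)

// fanOut runs work for every service concurrently, but never more than
// maxConcurrent at a time - the same WaitGroup + buffered-channel
// semaphore + mutex shape used in GetServices above.
func fanOut(services []string, maxConcurrent int, work func(string) int) map[string]int {
	var (
		wg      sync.WaitGroup
		mtx     sync.Mutex
		sem     = make(chan struct{}, maxConcurrent)
		results = map[string]int{}
	)
	for _, svc := range services {
		sem <- struct{}{} // acquire a slot; blocks while maxConcurrent are running
		wg.Add(1)
		go func(svc string) {
			defer wg.Done()
			defer func() { <-sem }() // release the slot
			r := work(svc)
			mtx.Lock() // results map is shared, so guard the write
			results[svc] = r
			mtx.Unlock()
		}(svc)
	}
	wg.Wait()
	return results
}

func main() {
	res := fanOut([]string{"frontend", "cart", "checkout"}, 2, func(svc string) int {
		return len(svc) // stand-in for the per-service ClickHouse query
	})
	fmt.Println(res["cart"]) // 4
}
```

Passing `svc` as an argument to the goroutine (rather than closing over the loop variable) is what makes each worker see its own service name.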
func (r *ClickHouseReader) GetServiceOverview(ctx context.Context, queryParams *model.GetServiceOverviewParams) (*[]model.ServiceOverviewItem, *model.ApiError) {

    topLevelOps, apiErr := r.GetTopLevelOperations(ctx)
    if apiErr != nil {
        return nil, apiErr
    }
    ops, ok := (*topLevelOps)[queryParams.ServiceName]
    if !ok {
        return nil, &model.ApiError{Typ: model.ErrorNotFound, Err: fmt.Errorf("Service not found")}
    }

    namedArgs := []interface{}{
        clickhouse.Named("interval", strconv.Itoa(int(queryParams.StepSeconds/60))),
        clickhouse.Named("start", strconv.FormatInt(queryParams.Start.UnixNano(), 10)),
        clickhouse.Named("end", strconv.FormatInt(queryParams.End.UnixNano(), 10)),
        clickhouse.Named("serviceName", queryParams.ServiceName),
        clickhouse.Named("names", ops),
    }

    serviceOverviewItems := []model.ServiceOverviewItem{}

    query := fmt.Sprintf("SELECT toStartOfInterval(timestamp, INTERVAL %s minute) as time, quantile(0.99)(durationNano) as p99, quantile(0.95)(durationNano) as p95,quantile(0.50)(durationNano) as p50, count(*) as numCalls FROM %s.%s WHERE timestamp>='%s' AND timestamp<='%s' AND kind='2' AND serviceName='%s'", strconv.Itoa(int(queryParams.StepSeconds/60)), r.traceDB, r.indexTable, strconv.FormatInt(queryParams.Start.UnixNano(), 10), strconv.FormatInt(queryParams.End.UnixNano(), 10), queryParams.ServiceName)
    query := fmt.Sprintf(`
        SELECT
            toStartOfInterval(timestamp, INTERVAL @interval minute) as time,
            quantile(0.99)(durationNano) as p99,
            quantile(0.95)(durationNano) as p95,
            quantile(0.50)(durationNano) as p50,
            count(*) as numCalls
        FROM %s.%s
        WHERE serviceName = @serviceName AND name In [@names] AND timestamp>= @start AND timestamp<= @end`,
        r.traceDB, r.indexTable,
    )
    args := []interface{}{}
    args = append(args, namedArgs...)
    args, errStatus := buildQueryWithTagParams(ctx, queryParams.Tags, &query, args)
    if errStatus != nil {
        return nil, errStatus
@@ -754,17 +818,25 @@ func (r *ClickHouseReader) GetServiceOverview(ctx context.Context, queryParams *
    query += " GROUP BY time ORDER BY time DESC"
    err := r.db.Select(ctx, &serviceOverviewItems, query, args...)

    zap.S().Info(query)
    zap.S().Debug(query)

    if err != nil {
        zap.S().Debug("Error in processing sql query: ", err)
        zap.S().Error("Error in processing sql query: ", err)
        return nil, &model.ApiError{Typ: model.ErrorExec, Err: fmt.Errorf("Error in processing sql query")}
    }

    serviceErrorItems := []model.ServiceErrorItem{}

    query = fmt.Sprintf("SELECT toStartOfInterval(timestamp, INTERVAL %s minute) as time, count(*) as numErrors FROM %s.%s WHERE timestamp>='%s' AND timestamp<='%s' AND kind='2' AND serviceName='%s' AND hasError=true", strconv.Itoa(int(queryParams.StepSeconds/60)), r.traceDB, r.indexTable, strconv.FormatInt(queryParams.Start.UnixNano(), 10), strconv.FormatInt(queryParams.End.UnixNano(), 10), queryParams.ServiceName)
    query = fmt.Sprintf(`
        SELECT
            toStartOfInterval(timestamp, INTERVAL @interval minute) as time,
            count(*) as numErrors
        FROM %s.%s
        WHERE serviceName = @serviceName AND name In [@names] AND timestamp>= @start AND timestamp<= @end AND statusCode=2`,
        r.traceDB, r.indexTable,
    )
    args = []interface{}{}
    args = append(args, namedArgs...)
    args, errStatus = buildQueryWithTagParams(ctx, queryParams.Tags, &query, args)
    if errStatus != nil {
        return nil, errStatus
@@ -772,10 +844,10 @@ func (r *ClickHouseReader) GetServiceOverview(ctx context.Context, queryParams *
    query += " GROUP BY time ORDER BY time DESC"
    err = r.db.Select(ctx, &serviceErrorItems, query, args...)

    zap.S().Info(query)
    zap.S().Debug(query)

    if err != nil {
        zap.S().Debug("Error in processing sql query: ", err)
        zap.S().Error("Error in processing sql query: ", err)
        return nil, &model.ApiError{Typ: model.ErrorExec, Err: fmt.Errorf("Error in processing sql query")}
    }
@@ -1516,45 +1588,67 @@ func (r *ClickHouseReader) GetTagValues(ctx context.Context, queryParams *model.
    return &cleanedTagValues, nil
}

func (r *ClickHouseReader) GetTopEndpoints(ctx context.Context, queryParams *model.GetTopEndpointsParams) (*[]model.TopEndpointsItem, *model.ApiError) {
func (r *ClickHouseReader) GetTopOperations(ctx context.Context, queryParams *model.GetTopOperationsParams) (*[]model.TopOperationsItem, *model.ApiError) {

    var topEndpointsItems []model.TopEndpointsItem
    namedArgs := []interface{}{
        clickhouse.Named("start", strconv.FormatInt(queryParams.Start.UnixNano(), 10)),
        clickhouse.Named("end", strconv.FormatInt(queryParams.End.UnixNano(), 10)),
        clickhouse.Named("serviceName", queryParams.ServiceName),
    }

    query := fmt.Sprintf("SELECT quantile(0.5)(durationNano) as p50, quantile(0.95)(durationNano) as p95, quantile(0.99)(durationNano) as p99, COUNT(1) as numCalls, name FROM %s.%s WHERE timestamp >= '%s' AND timestamp <= '%s' AND kind='2' and serviceName='%s'", r.traceDB, r.indexTable, strconv.FormatInt(queryParams.Start.UnixNano(), 10), strconv.FormatInt(queryParams.End.UnixNano(), 10), queryParams.ServiceName)
    var topOperationsItems []model.TopOperationsItem

    query := fmt.Sprintf(`
        SELECT
            quantile(0.5)(durationNano) as p50,
            quantile(0.95)(durationNano) as p95,
            quantile(0.99)(durationNano) as p99,
            COUNT(*) as numCalls,
            name
        FROM %s.%s
        WHERE serviceName = @serviceName AND timestamp>= @start AND timestamp<= @end`,
        r.traceDB, r.indexTable,
    )
    args := []interface{}{}
    args = append(args, namedArgs...)
    args, errStatus := buildQueryWithTagParams(ctx, queryParams.Tags, &query, args)
    if errStatus != nil {
        return nil, errStatus
    }
    query += " GROUP BY name"
    err := r.db.Select(ctx, &topEndpointsItems, query, args...)
    query += " GROUP BY name ORDER BY p99 DESC LIMIT 10"
    err := r.db.Select(ctx, &topOperationsItems, query, args...)

    zap.S().Info(query)
    zap.S().Debug(query)

    if err != nil {
        zap.S().Debug("Error in processing sql query: ", err)
        zap.S().Error("Error in processing sql query: ", err)
        return nil, &model.ApiError{Typ: model.ErrorExec, Err: fmt.Errorf("Error in processing sql query")}
    }

    if topEndpointsItems == nil {
        topEndpointsItems = []model.TopEndpointsItem{}
    if topOperationsItems == nil {
        topOperationsItems = []model.TopOperationsItem{}
    }

    return &topEndpointsItems, nil
    return &topOperationsItems, nil
}
func (r *ClickHouseReader) GetUsage(ctx context.Context, queryParams *model.GetUsageParams) (*[]model.UsageItem, error) {

    var usageItems []model.UsageItem

    namedArgs := []interface{}{
        clickhouse.Named("interval", queryParams.StepHour),
        clickhouse.Named("start", strconv.FormatInt(queryParams.Start.UnixNano(), 10)),
        clickhouse.Named("end", strconv.FormatInt(queryParams.End.UnixNano(), 10)),
    }
    var query string
    if len(queryParams.ServiceName) != 0 {
        query = fmt.Sprintf("SELECT toStartOfInterval(timestamp, INTERVAL %d HOUR) as time, count(1) as count FROM %s.%s WHERE serviceName='%s' AND timestamp>='%s' AND timestamp<='%s' GROUP BY time ORDER BY time ASC", queryParams.StepHour, r.traceDB, r.indexTable, queryParams.ServiceName, strconv.FormatInt(queryParams.Start.UnixNano(), 10), strconv.FormatInt(queryParams.End.UnixNano(), 10))
        namedArgs = append(namedArgs, clickhouse.Named("serviceName", queryParams.ServiceName))
        query = fmt.Sprintf("SELECT toStartOfInterval(timestamp, INTERVAL @interval HOUR) as time, sum(count) as count FROM %s.%s WHERE service_name=@serviceName AND timestamp>=@start AND timestamp<=@end GROUP BY time ORDER BY time ASC", r.traceDB, r.usageExplorerTable)
    } else {
        query = fmt.Sprintf("SELECT toStartOfInterval(timestamp, INTERVAL %d HOUR) as time, count(1) as count FROM %s.%s WHERE timestamp>='%s' AND timestamp<='%s' GROUP BY time ORDER BY time ASC", queryParams.StepHour, r.traceDB, r.indexTable, strconv.FormatInt(queryParams.Start.UnixNano(), 10), strconv.FormatInt(queryParams.End.UnixNano(), 10))
        query = fmt.Sprintf("SELECT toStartOfInterval(timestamp, INTERVAL @interval HOUR) as time, sum(count) as count FROM %s.%s WHERE timestamp>=@start AND timestamp<=@end GROUP BY time ORDER BY time ASC", r.traceDB, r.usageExplorerTable)
    }

    err := r.db.Select(ctx, &usageItems, query)
    err := r.db.Select(ctx, &usageItems, query, namedArgs...)

    zap.S().Info(query)
@@ -1614,48 +1708,50 @@ func interfaceArrayToStringArray(array []interface{}) []string {
    return strArray
}

func (r *ClickHouseReader) GetServiceMapDependencies(ctx context.Context, queryParams *model.GetServicesParams) (*[]model.ServiceMapDependencyResponseItem, error) {
    serviceMapDependencyItems := []model.ServiceMapDependencyItem{}
func (r *ClickHouseReader) GetDependencyGraph(ctx context.Context, queryParams *model.GetServicesParams) (*[]model.ServiceMapDependencyResponseItem, error) {

    query := fmt.Sprintf(`SELECT spanID, parentSpanID, serviceName FROM %s.%s WHERE timestamp>='%s' AND timestamp<='%s'`, r.traceDB, r.indexTable, strconv.FormatInt(queryParams.Start.UnixNano(), 10), strconv.FormatInt(queryParams.End.UnixNano(), 10))
    response := []model.ServiceMapDependencyResponseItem{}

    err := r.db.Select(ctx, &serviceMapDependencyItems, query)
    args := []interface{}{}
    args = append(args,
        clickhouse.Named("start", uint64(queryParams.Start.Unix())),
        clickhouse.Named("end", uint64(queryParams.End.Unix())),
        clickhouse.Named("duration", uint64(queryParams.End.Unix()-queryParams.Start.Unix())),
    )

    zap.S().Info(query)
    query := fmt.Sprintf(`
        WITH
            quantilesMergeState(0.5, 0.75, 0.9, 0.95, 0.99)(duration_quantiles_state) AS duration_quantiles_state,
            finalizeAggregation(duration_quantiles_state) AS result
        SELECT
            src as parent,
            dest as child,
            result[1] AS p50,
            result[2] AS p75,
            result[3] AS p90,
            result[4] AS p95,
            result[5] AS p99,
            sum(total_count) as callCount,
            sum(total_count)/ @duration AS callRate,
            sum(error_count)/sum(total_count) as errorRate
        FROM %s.%s
        WHERE toUInt64(toDateTime(timestamp)) >= @start AND toUInt64(toDateTime(timestamp)) <= @end
        GROUP BY
            src,
            dest`,
        r.traceDB, r.dependencyGraphTable,
    )

    zap.S().Debug(query, args)

    err := r.db.Select(ctx, &response, query, args...)

    if err != nil {
        zap.S().Debug("Error in processing sql query: ", err)
        zap.S().Error("Error in processing sql query: ", err)
        return nil, fmt.Errorf("Error in processing sql query")
    }

    serviceMap := make(map[string]*model.ServiceMapDependencyResponseItem)

    spanId2ServiceNameMap := make(map[string]string)
    for i := range serviceMapDependencyItems {
        spanId2ServiceNameMap[serviceMapDependencyItems[i].SpanId] = serviceMapDependencyItems[i].ServiceName
    }
    for i := range serviceMapDependencyItems {
        parent2childServiceName := spanId2ServiceNameMap[serviceMapDependencyItems[i].ParentSpanId] + "-" + spanId2ServiceNameMap[serviceMapDependencyItems[i].SpanId]
        if _, ok := serviceMap[parent2childServiceName]; !ok {
            serviceMap[parent2childServiceName] = &model.ServiceMapDependencyResponseItem{
                Parent:    spanId2ServiceNameMap[serviceMapDependencyItems[i].ParentSpanId],
                Child:     spanId2ServiceNameMap[serviceMapDependencyItems[i].SpanId],
                CallCount: 1,
            }
        } else {
            serviceMap[parent2childServiceName].CallCount++
        }
    }

    retMe := make([]model.ServiceMapDependencyResponseItem, 0, len(serviceMap))
    for _, dependency := range serviceMap {
        if dependency.Parent == "" {
            continue
        }
        retMe = append(retMe, *dependency)
    }

    return &retMe, nil
    return &response, nil
}
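The new dependency-graph query turns pre-aggregated edge rows into a call rate and an error ratio per (src, dest) pair via `sum(total_count)/@duration` and `sum(error_count)/sum(total_count)`. The same arithmetic, sketched on in-memory rows with made-up numbers:

```go
package main

import "fmt"

type edge struct {
	parent, child          string
	totalCount, errorCount uint64
}

// edgeRates groups edge rows by (parent, child) and computes the call
// rate (calls per second over the query window) and error ratio,
// matching the aggregate expressions in the query above.
func edgeRates(edges []edge, windowSeconds uint64) map[string][2]float64 {
	totals := map[string][2]uint64{}
	for _, e := range edges {
		key := e.parent + "->" + e.child
		t := totals[key]
		totals[key] = [2]uint64{t[0] + e.totalCount, t[1] + e.errorCount}
	}
	rates := map[string][2]float64{}
	for key, t := range totals {
		rates[key] = [2]float64{
			float64(t[0]) / float64(windowSeconds), // callRate
			float64(t[1]) / float64(t[0]),          // errorRate
		}
	}
	return rates
}

func main() {
	// Two minute-resolution rows for the same edge over a 60s window.
	edges := []edge{
		{"frontend", "cart", 90, 9},
		{"frontend", "cart", 30, 3},
	}
	r := edgeRates(edges, 60)
	fmt.Println(r["frontend->cart"][0]) // 2
	fmt.Println(r["frontend->cart"][1]) // 0.1
}
```

Because the table already stores per-minute aggregates, the old span-by-span parent/child join over the index table is no longer needed.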
func (r *ClickHouseReader) GetFilteredSpansAggregates(ctx context.Context, queryParams *model.GetFilteredSpanAggregatesParams) (*model.GetFilteredSpansAggregatesResponse, *model.ApiError) {
@@ -1895,7 +1991,7 @@ func (r *ClickHouseReader) SetTTL(ctx context.Context,

    switch params.Type {
    case constants.TraceTTL:
        tableNameArray := []string{signozTraceDBName + "." + signozTraceTableName, signozTraceDBName + "." + signozDurationMVTable, signozTraceDBName + "." + signozSpansTable, signozTraceDBName + "." + signozErrorIndexTable}
        tableNameArray := []string{signozTraceDBName + "." + signozTraceTableName, signozTraceDBName + "." + signozDurationMVTable, signozTraceDBName + "." + signozSpansTable, signozTraceDBName + "." + signozErrorIndexTable, signozTraceDBName + "." + signozUsageExplorerTable, signozTraceDBName + "." + defaultDependencyGraphTable}
        for _, tableName = range tableNameArray {
            statusItem, err := r.checkTTLStatusItem(ctx, tableName)
            if err != nil {
@@ -2170,7 +2266,7 @@ func (r *ClickHouseReader) GetTTL(ctx context.Context, ttlParams *model.GetTTLPa

    switch ttlParams.Type {
    case constants.TraceTTL:
        tableNameArray := []string{signozTraceDBName + "." + signozTraceTableName, signozTraceDBName + "." + signozDurationMVTable, signozTraceDBName + "." + signozSpansTable, signozTraceDBName + "." + signozErrorIndexTable}
        tableNameArray := []string{signozTraceDBName + "." + signozTraceTableName, signozTraceDBName + "." + signozDurationMVTable, signozTraceDBName + "." + signozSpansTable, signozTraceDBName + "." + signozErrorIndexTable, signozTraceDBName + "." + signozUsageExplorerTable, signozTraceDBName + "." + defaultDependencyGraphTable}
        status, err := r.setTTLQueryStatus(ctx, tableNameArray)
        if err != nil {
            return nil, err