Merge pull request #6075 from SigNoz/release/v0.55.x

Release/v0.55.x
This commit is contained in:
Prashant Shahi 2024-09-25 22:25:37 +05:30 committed by GitHub
commit 5fa8686fcf
No known key found for this signature in database
GPG Key ID: B5690EEEBB952194
205 changed files with 13917 additions and 4619 deletions

View File

@ -0,0 +1,49 @@
---
name: Request Dashboard
about: Request a new dashboard for the SigNoz Dashboards repository
title: '[Dashboard Request] '
labels: 'dashboard-template'
assignees: ''
---
<!-- Use this template to request a new dashboard for the SigNoz Dashboards repository. Providing detailed information will help us understand your needs better and speed up the dashboard creation process. -->
## Dashboard Name
<!-- Provide the name for the requested dashboard. Be specific (e.g., "MySQL Monitoring Dashboard"). -->
## Expected Dashboard Sections and Panels
(Sections and panels can be added or removed according to the available metrics.)
### Section Name
<!-- Brief description of what this section should display (e.g., "Resource usage metrics for MySQL database"). -->
### Panel Name
<!-- Description of the panel (e.g., "Displays current CPU usage, memory usage, etc."). -->
<!-- - **Example:**
- **Section**: Resource Metrics
- **Panel**: CPU Usage - Displays the current CPU usage across all database instances.
- **Panel**: Memory Usage - Displays the total memory used by the MySQL process. -->
<!-- Repeat this format for any additional sections or panels. -->
## Expected Dashboard Variables
<!-- List any dashboard variables that should be included in the dashboard. Examples could be `deployment.environment`, `hostname`, `region`, etc. -->
## Additional Comments or Requirements
<!-- Include any other details, special requirements, or specific visualizations you'd like to request for this dashboard. -->
## References or Screenshots
<!-- Add any references or screenshots of the requested dashboard, if available. -->
## 📋 Notes
Please review the [CONTRIBUTING.md](https://github.com/SigNoz/dashboards/blob/main/CONTRIBUTING.md) for guidelines on dashboard structure, naming conventions, and how to submit a pull request.

View File

@ -11,9 +11,9 @@ jobs:
check-no-ee-references:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Run check
run: make check-no-ee-references
- uses: actions/checkout@v4
- name: Run check
run: make check-no-ee-references
build-frontend:
runs-on: ubuntu-latest
@ -43,7 +43,6 @@ jobs:
run: |
echo 'INTERCOM_APP_ID="${{ secrets.INTERCOM_APP_ID }}"' > frontend/.env
echo 'SEGMENT_ID="${{ secrets.SEGMENT_ID }}"' >> frontend/.env
echo 'CLARITY_PROJECT_ID="${{ secrets.CLARITY_PROJECT_ID }}"' >> frontend/.env
- name: Install dependencies
run: cd frontend && yarn install
- name: Run ESLint

View File

@ -9,7 +9,6 @@ on:
- v*
jobs:
image-build-and-push-query-service:
runs-on: ubuntu-latest
steps:
@ -151,7 +150,6 @@ jobs:
run: |
echo 'INTERCOM_APP_ID="${{ secrets.INTERCOM_APP_ID }}"' > frontend/.env
echo 'SEGMENT_ID="${{ secrets.SEGMENT_ID }}"' >> frontend/.env
echo 'CLARITY_PROJECT_ID="${{ secrets.CLARITY_PROJECT_ID }}"' >> frontend/.env
echo 'SENTRY_AUTH_TOKEN="${{ secrets.SENTRY_AUTH_TOKEN }}"' >> frontend/.env
echo 'SENTRY_ORG="${{ secrets.SENTRY_ORG }}"' >> frontend/.env
echo 'SENTRY_PROJECT_ID="${{ secrets.SENTRY_PROJECT_ID }}"' >> frontend/.env

View File

@ -30,6 +30,7 @@ Also, have a look at these [good first issues label](https://github.com/SigNoz/s
- [To run ClickHouse setup](#41-to-run-clickhouse-setup-recommended-for-local-development)
- [Contribute to SigNoz Helm Chart](#5-contribute-to-signoz-helm-chart-)
- [To run helm chart for local development](#51-to-run-helm-chart-for-local-development)
- [Contribute to Dashboards](#6-contribute-to-dashboards-)
- [Other Ways to Contribute](#other-ways-to-contribute)
# 1. General Instructions 📝
@ -37,7 +38,7 @@ Also, have a look at these [good first issues label](https://github.com/SigNoz/s
## 1.1 For Creating Issue(s)
Before making any significant changes and before filing a new issue, please check [existing open](https://github.com/SigNoz/signoz/issues?q=is%3Aopen+is%3Aissue), or [recently closed](https://github.com/SigNoz/signoz/issues?q=is%3Aissue+is%3Aclosed) issues to make sure somebody else hasn't already reported the issue. Please try to include as much information as you can.
**Issue Types** - [Bug Report](https://github.com/SigNoz/signoz/issues/new?assignees=&labels=&template=bug_report.md&title=) | [Feature Request](https://github.com/SigNoz/signoz/issues/new?assignees=&labels=&template=feature_request.md&title=) | [Performance Issue Report](https://github.com/SigNoz/signoz/issues/new?assignees=&labels=&template=performance-issue-report.md&title=) | [Report a Security Vulnerability](https://github.com/SigNoz/signoz/security/policy)
**Issue Types** - [Bug Report](https://github.com/SigNoz/signoz/issues/new?assignees=&labels=&template=bug_report.md&title=) | [Feature Request](https://github.com/SigNoz/signoz/issues/new?assignees=&labels=&template=feature_request.md&title=) | [Performance Issue Report](https://github.com/SigNoz/signoz/issues/new?assignees=&labels=&template=performance-issue-report.md&title=) | [Request Dashboard](https://github.com/SigNoz/signoz/issues/new?assignees=&labels=dashboard-template&projects=&template=request_dashboard.md&title=%5BDashboard+Request%5D+) | [Report a Security Vulnerability](https://github.com/SigNoz/signoz/security/policy)
#### Details like these are incredibly useful:
@ -56,7 +57,7 @@ Before making any significant changes and before filing a new issue, please chec
Discussing your proposed changes ahead of time will make the contribution
process smooth for everyone 🙌.
**[`^top^`](#)**
**[`^top^`](#contributing-guidelines)**
<hr>
@ -97,13 +98,14 @@ GitHub provides additional document on [forking a repository](https://help.githu
stability and quality of the component.
You can always reach out to `ankit@signoz.io` to understand more about the repo and product. We are very responsive over email and [SLACK](https://signoz.io/slack).
You can always reach out to `ankit@signoz.io` to understand more about the repo and product. We are very responsive over email and [slack community](https://signoz.io/slack).
### Pointers:
- If you find any **bugs** → please create an [**issue.**](https://github.com/SigNoz/signoz/issues/new?assignees=&labels=&template=bug_report.md&title=)
- If you find anything **missing** in documentation → you can create an issue with the label **`documentation`**.
- If you want to build any **new feature** → please create an [issue with the label **`enhancement`**.](https://github.com/SigNoz/signoz/issues/new?assignees=&labels=&template=feature_request.md&title=)
- If you want to **discuss** something about the product, start a new [**discussion**.](https://github.com/SigNoz/signoz/discussions)
- If you want to request a new **dashboard template** → please create an issue [here](https://github.com/SigNoz/signoz/issues/new?assignees=&labels=dashboard-template&projects=&template=request_dashboard.md&title=%5BDashboard+Request%5D+).
<hr>
@ -117,7 +119,7 @@ e.g. If you are submitting a fix for an issue in frontend, the PR name should be
- Feel free to ping us on [`#contributing`](https://signoz-community.slack.com/archives/C01LWQ8KS7M) or [`#contributing-frontend`](https://signoz-community.slack.com/archives/C027134DM8B) on our slack community if you need any help on this :)
**[`^top^`](#)**
**[`^top^`](#contributing-guidelines)**
<hr>
@ -127,14 +129,13 @@ e.g. If you are submitting a fix for an issue in frontend, the PR name should be
- [**Frontend**](#3-develop-frontend-) (Written in Typescript, React)
- [**Backend**](#4-contribute-to-backend-query-service-) (Query Service, written in Go)
- [**Dashboard Templates**](#6-contribute-to-dashboards-) (JSON dashboard templates built with SigNoz)
Depending upon your area of expertise & interest, you can choose one or more to contribute. Below are detailed instructions to contribute in each area.
**Please note:** If you want to work on an issue, please ask the maintainers to assign the issue to you before starting work on it. This would help us understand who is working on an issue and prevent duplicate work. 🙏🏻
**Please note:** If you want to work on an issue, please add a brief description of your solution on the issue before starting work on it.
⚠️ If you just raise a PR without the corresponding issue being assigned to you, it may not be accepted.
**[`^top^`](#)**
**[`^top^`](#contributing-guidelines)**
<hr>
@ -188,7 +189,7 @@ Also, have a look at [Frontend README.md](https://github.com/SigNoz/signoz/blob/
### Important Notes:
Maintainers / contributors who change line numbers of `Frontend` & `Query-Section` should update the line numbers in [`/.scripts/commentLinesForSetup.sh`](https://github.com/SigNoz/signoz/blob/develop/.scripts/commentLinesForSetup.sh)
**[`^top^`](#)**
**[`^top^`](#contributing-guidelines)**
## 3.2 Contribute to Frontend without installing SigNoz backend
@ -209,7 +210,7 @@ Please ping us in the [`#contributing`](https://signoz-community.slack.com/archi
**Frontend should now be accessible at** [`http://localhost:3301/services`](http://localhost:3301/services)
**[`^top^`](#)**
**[`^top^`](#contributing-guidelines)**
<hr>
@ -309,7 +310,7 @@ Click the button below. A workspace with all required environments will be creat
> To use it on your forked repo, edit the 'Open in Gitpod' button URL to `https://gitpod.io/#https://github.com/<your-github-username>/signoz` -->
**[`^top^`](#)**
**[`^top^`](#contributing-guidelines)**
<hr>
@ -365,10 +366,21 @@ curl -sL https://github.com/SigNoz/signoz/raw/develop/sample-apps/hotrod/hotrod-
| HOTROD_NAMESPACE=sample-application bash
```
**[`^top^`](#)**
**[`^top^`](#contributing-guidelines)**
---
# 6. Contribute to Dashboards 📈
**Need to Update: [https://github.com/SigNoz/dashboards](https://github.com/SigNoz/dashboards)**
To contribute a new dashboard template for any service, follow the contribution guidelines in the [Dashboard Contributing Guide](https://github.com/SigNoz/dashboards/blob/main/CONTRIBUTING.md). In brief:
1. Create a dashboard JSON file.
2. Add a README file explaining the dashboard, the metrics ingested, and the configurations needed.
3. Include screenshots of the dashboard in the `assets/` directory.
4. Submit a pull request for review.
## Other Ways to Contribute
There are many other ways to get involved with the community and to participate in this project:
@ -379,7 +391,6 @@ There are many other ways to get involved with the community and to participate
- Help answer questions on forums such as Stack Overflow and [SigNoz Community Slack Channel](https://signoz.io/slack).
- Tell others about the project on Twitter, your blog, etc.
Again, feel free to ping us on [`#contributing`](https://signoz-community.slack.com/archives/C01LWQ8KS7M) or [`#contributing-frontend`](https://signoz-community.slack.com/archives/C027134DM8B) on our slack community if you need any help on this :)
Thank You!

View File

@ -23,6 +23,9 @@
[1]: https://github.com/pocoproject/poco/blob/poco-1.9.4-release/Foundation/include/Poco/Logger.h#L105-L114
-->
<level>information</level>
<formatting>
<type>json</type>
</formatting>
<log>/var/log/clickhouse-server/clickhouse-server.log</log>
<errorlog>/var/log/clickhouse-server/clickhouse-server.err.log</errorlog>
<!-- Rotation policy

View File

@ -146,7 +146,7 @@ services:
condition: on-failure
query-service:
image: signoz/query-service:0.54.0
image: signoz/query-service:0.55.0
command:
[
"-config=/root/config/prometheus.yml",
@ -186,7 +186,7 @@ services:
<<: *db-depend
frontend:
image: signoz/frontend:0.54.0
image: signoz/frontend:0.55.0
deploy:
restart_policy:
condition: on-failure
@ -199,7 +199,7 @@ services:
- ../common/nginx-config.conf:/etc/nginx/conf.d/default.conf
otel-collector:
image: signoz/signoz-otel-collector:0.102.8
image: signoz/signoz-otel-collector:0.102.10
command:
[
"--config=/etc/otel-collector-config.yaml",
@ -238,7 +238,7 @@ services:
- query-service
otel-collector-migrator:
image: signoz/signoz-schema-migrator:0.102.8
image: signoz/signoz-schema-migrator:0.102.10
deploy:
restart_policy:
condition: on-failure

View File

@ -154,6 +154,8 @@ extensions:
service:
telemetry:
logs:
encoding: json
metrics:
address: 0.0.0.0:8888
extensions: [health_check, zpages, pprof]

View File

@ -23,6 +23,9 @@
[1]: https://github.com/pocoproject/poco/blob/poco-1.9.4-release/Foundation/include/Poco/Logger.h#L105-L114
-->
<level>information</level>
<formatting>
<type>json</type>
</formatting>
<log>/var/log/clickhouse-server/clickhouse-server.log</log>
<errorlog>/var/log/clickhouse-server/clickhouse-server.err.log</errorlog>
<!-- Rotation policy

View File

@ -66,7 +66,7 @@ services:
- --storage.path=/data
otel-collector-migrator:
image: signoz/signoz-schema-migrator:${OTELCOL_TAG:-0.102.8}
image: signoz/signoz-schema-migrator:${OTELCOL_TAG:-0.102.10}
container_name: otel-migrator
command:
- "--dsn=tcp://clickhouse:9000"
@ -81,7 +81,7 @@ services:
# Notes for Maintainers/Contributors who will change Line Numbers of Frontend & Query-Section. Please Update Line Numbers in `./scripts/commentLinesForSetup.sh` & `./CONTRIBUTING.md`
otel-collector:
container_name: signoz-otel-collector
image: signoz/signoz-otel-collector:0.102.8
image: signoz/signoz-otel-collector:0.102.10
command:
[
"--config=/etc/otel-collector-config.yaml",

View File

@ -164,7 +164,7 @@ services:
# Notes for Maintainers/Contributors who will change Line Numbers of Frontend & Query-Section. Please Update Line Numbers in `./scripts/commentLinesForSetup.sh` & `./CONTRIBUTING.md`
query-service:
image: signoz/query-service:${DOCKER_TAG:-0.54.0}
image: signoz/query-service:${DOCKER_TAG:-0.55.0}
container_name: signoz-query-service
command:
[
@ -204,7 +204,7 @@ services:
<<: *db-depend
frontend:
image: signoz/frontend:${DOCKER_TAG:-0.54.0}
image: signoz/frontend:${DOCKER_TAG:-0.55.0}
container_name: signoz-frontend
restart: on-failure
depends_on:
@ -216,7 +216,7 @@ services:
- ../common/nginx-config.conf:/etc/nginx/conf.d/default.conf
otel-collector-migrator:
image: signoz/signoz-schema-migrator:${OTELCOL_TAG:-0.102.8}
image: signoz/signoz-schema-migrator:${OTELCOL_TAG:-0.102.10}
container_name: otel-migrator
command:
- "--dsn=tcp://clickhouse:9000"
@ -230,7 +230,7 @@ services:
otel-collector:
image: signoz/signoz-otel-collector:${OTELCOL_TAG:-0.102.8}
image: signoz/signoz-otel-collector:${OTELCOL_TAG:-0.102.10}
container_name: signoz-otel-collector
command:
[

View File

@ -164,7 +164,7 @@ services:
# Notes for Maintainers/Contributors who will change Line Numbers of Frontend & Query-Section. Please Update Line Numbers in `./scripts/commentLinesForSetup.sh` & `./CONTRIBUTING.md`
query-service:
image: signoz/query-service:${DOCKER_TAG:-0.54.0}
image: signoz/query-service:${DOCKER_TAG:-0.55.0}
container_name: signoz-query-service
command:
[
@ -203,7 +203,7 @@ services:
<<: *db-depend
frontend:
image: signoz/frontend:${DOCKER_TAG:-0.54.0}
image: signoz/frontend:${DOCKER_TAG:-0.55.0}
container_name: signoz-frontend
restart: on-failure
depends_on:
@ -215,7 +215,7 @@ services:
- ../common/nginx-config.conf:/etc/nginx/conf.d/default.conf
otel-collector-migrator:
image: signoz/signoz-schema-migrator:${OTELCOL_TAG:-0.102.8}
image: signoz/signoz-schema-migrator:${OTELCOL_TAG:-0.102.10}
container_name: otel-migrator
command:
- "--dsn=tcp://clickhouse:9000"
@ -229,7 +229,7 @@ services:
otel-collector:
image: signoz/signoz-otel-collector:${OTELCOL_TAG:-0.102.8}
image: signoz/signoz-otel-collector:${OTELCOL_TAG:-0.102.10}
container_name: signoz-otel-collector
command:
[

View File

@ -158,6 +158,8 @@ exporters:
service:
telemetry:
logs:
encoding: json
metrics:
address: 0.0.0.0:8888
extensions:

View File

@ -0,0 +1,44 @@
package anomaly
import (
"context"
querierV2 "go.signoz.io/signoz/pkg/query-service/app/querier/v2"
"go.signoz.io/signoz/pkg/query-service/app/queryBuilder"
)
type DailyProvider struct {
BaseSeasonalProvider
}
var _ BaseProvider = (*DailyProvider)(nil)
func (dp *DailyProvider) GetBaseSeasonalProvider() *BaseSeasonalProvider {
return &dp.BaseSeasonalProvider
}
// NewDailyProvider uses the same generic option type
func NewDailyProvider(opts ...GenericProviderOption[*DailyProvider]) *DailyProvider {
dp := &DailyProvider{
BaseSeasonalProvider: BaseSeasonalProvider{},
}
for _, opt := range opts {
opt(dp)
}
dp.querierV2 = querierV2.NewQuerier(querierV2.QuerierOptions{
Reader: dp.reader,
Cache: dp.cache,
KeyGenerator: queryBuilder.NewKeyGenerator(),
FluxInterval: dp.fluxInterval,
FeatureLookup: dp.ff,
})
return dp
}
func (p *DailyProvider) GetAnomalies(ctx context.Context, req *GetAnomaliesRequest) (*GetAnomaliesResponse, error) {
req.Seasonality = SeasonalityDaily
return p.getAnomalies(ctx, req)
}

View File

@ -0,0 +1,44 @@
package anomaly
import (
"context"
querierV2 "go.signoz.io/signoz/pkg/query-service/app/querier/v2"
"go.signoz.io/signoz/pkg/query-service/app/queryBuilder"
)
type HourlyProvider struct {
BaseSeasonalProvider
}
var _ BaseProvider = (*HourlyProvider)(nil)
func (hp *HourlyProvider) GetBaseSeasonalProvider() *BaseSeasonalProvider {
return &hp.BaseSeasonalProvider
}
// NewHourlyProvider now uses the generic option type
func NewHourlyProvider(opts ...GenericProviderOption[*HourlyProvider]) *HourlyProvider {
hp := &HourlyProvider{
BaseSeasonalProvider: BaseSeasonalProvider{},
}
for _, opt := range opts {
opt(hp)
}
hp.querierV2 = querierV2.NewQuerier(querierV2.QuerierOptions{
Reader: hp.reader,
Cache: hp.cache,
KeyGenerator: queryBuilder.NewKeyGenerator(),
FluxInterval: hp.fluxInterval,
FeatureLookup: hp.ff,
})
return hp
}
func (p *HourlyProvider) GetAnomalies(ctx context.Context, req *GetAnomaliesRequest) (*GetAnomaliesResponse, error) {
req.Seasonality = SeasonalityHourly
return p.getAnomalies(ctx, req)
}

View File

@ -0,0 +1,248 @@
package anomaly
import (
"math"
"time"
"go.signoz.io/signoz/pkg/query-service/common"
v3 "go.signoz.io/signoz/pkg/query-service/model/v3"
)
type Seasonality string
const (
SeasonalityHourly Seasonality = "hourly"
SeasonalityDaily Seasonality = "daily"
SeasonalityWeekly Seasonality = "weekly"
)
func (s Seasonality) String() string {
return string(s)
}
var (
oneWeekOffset = 24 * 7 * time.Hour.Milliseconds()
oneDayOffset = 24 * time.Hour.Milliseconds()
oneHourOffset = time.Hour.Milliseconds()
fiveMinOffset = 5 * time.Minute.Milliseconds()
)
func (s Seasonality) IsValid() bool {
switch s {
case SeasonalityHourly, SeasonalityDaily, SeasonalityWeekly:
return true
default:
return false
}
}
type GetAnomaliesRequest struct {
Params *v3.QueryRangeParamsV3
Seasonality Seasonality
}
type GetAnomaliesResponse struct {
Results []*v3.Result
}
// anomalyQueryParams holds the params for anomaly detection
// prediction = avg(past_period_query) + avg(current_season_query) - mean(past_season_query, past2_season_query, past3_season_query)
//
//                     ^                              ^
//                     |                              |
//     (rounded value for past period)      (seasonal growth)
//
// score = abs(value - prediction) / stddev(current_season_query)
type anomalyQueryParams struct {
// CurrentPeriodQuery is the query range params for the period the user is looking at, i.e. the eval window
// Example: (now-5m, now), (now-30m, now), (now-1h, now)
// The results obtained from this query are used to compare with predicted values
// and to detect anomalies
CurrentPeriodQuery *v3.QueryRangeParamsV3
// PastPeriodQuery is the query range params for past seasonal period
// Example: For weekly seasonality, (now-1w-5m, now-1w)
// : For daily seasonality, (now-1d-5m, now-1d)
// : For hourly seasonality, (now-1h-5m, now-1h)
PastPeriodQuery *v3.QueryRangeParamsV3
// CurrentSeasonQuery is the query range params for current period (seasonal)
// Example: For weekly seasonality, this is the query range params for the (now-1w-5m, now)
// : For daily seasonality, this is the query range params for the (now-1d-5m, now)
// : For hourly seasonality, this is the query range params for the (now-1h-5m, now)
CurrentSeasonQuery *v3.QueryRangeParamsV3
// PastSeasonQuery is the query range params for the seasonal period immediately preceding the current season
// Example: For weekly seasonality, this is the query range params for the (now-2w-5m, now-1w)
// : For daily seasonality, this is the query range params for the (now-2d-5m, now-1d)
// : For hourly seasonality, this is the query range params for the (now-2h-5m, now-1h)
PastSeasonQuery *v3.QueryRangeParamsV3
// Past2SeasonQuery is the query range params for the seasonal period two seasons before the current one
// Example: For weekly seasonality, this is the query range params for the (now-3w-5m, now-2w)
// : For daily seasonality, this is the query range params for the (now-3d-5m, now-2d)
// : For hourly seasonality, this is the query range params for the (now-3h-5m, now-2h)
Past2SeasonQuery *v3.QueryRangeParamsV3
// Past3SeasonQuery is the query range params for the seasonal period three seasons before the current one
// Example: For weekly seasonality, this is the query range params for the (now-4w-5m, now-3w)
// : For daily seasonality, this is the query range params for the (now-4d-5m, now-3d)
// : For hourly seasonality, this is the query range params for the (now-4h-5m, now-3h)
Past3SeasonQuery *v3.QueryRangeParamsV3
}
func updateStepInterval(req *v3.QueryRangeParamsV3) {
start := req.Start
end := req.End
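// Floor the overall step at the minimum allowed interval for the range, but never below 60 seconds.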
req.Step = int64(math.Max(float64(common.MinAllowedStepInterval(start, end)), 60))
for _, q := range req.CompositeQuery.BuilderQueries {
// If the step interval is less than the minimum allowed step interval, set it to the minimum allowed step interval
if minStep := common.MinAllowedStepInterval(start, end); q.StepInterval < minStep {
q.StepInterval = minStep
}
}
}
func prepareAnomalyQueryParams(req *v3.QueryRangeParamsV3, seasonality Seasonality) *anomalyQueryParams {
start := req.Start
end := req.End
currentPeriodQuery := &v3.QueryRangeParamsV3{
Start: start,
End: end,
CompositeQuery: req.CompositeQuery.Clone(),
Variables: make(map[string]interface{}, 0),
NoCache: false,
}
updateStepInterval(currentPeriodQuery)
var pastPeriodStart, pastPeriodEnd int64
switch seasonality {
// for one week period, we fetch the data from the past week with 5 min offset
case SeasonalityWeekly:
pastPeriodStart = start - oneWeekOffset - fiveMinOffset
pastPeriodEnd = end - oneWeekOffset
// for one day period, we fetch the data from the past day with 5 min offset
case SeasonalityDaily:
pastPeriodStart = start - oneDayOffset - fiveMinOffset
pastPeriodEnd = end - oneDayOffset
// for one hour period, we fetch the data from the past hour with 5 min offset
case SeasonalityHourly:
pastPeriodStart = start - oneHourOffset - fiveMinOffset
pastPeriodEnd = end - oneHourOffset
}
pastPeriodQuery := &v3.QueryRangeParamsV3{
Start: pastPeriodStart,
End: pastPeriodEnd,
CompositeQuery: req.CompositeQuery.Clone(),
Variables: make(map[string]interface{}, 0),
NoCache: false,
}
updateStepInterval(pastPeriodQuery)
// seasonality growth trend
var currentGrowthPeriodStart, currentGrowthPeriodEnd int64
switch seasonality {
case SeasonalityWeekly:
currentGrowthPeriodStart = start - oneWeekOffset
currentGrowthPeriodEnd = end
case SeasonalityDaily:
currentGrowthPeriodStart = start - oneDayOffset
currentGrowthPeriodEnd = end
case SeasonalityHourly:
currentGrowthPeriodStart = start - oneHourOffset
currentGrowthPeriodEnd = end
}
currentGrowthQuery := &v3.QueryRangeParamsV3{
Start: currentGrowthPeriodStart,
End: currentGrowthPeriodEnd,
CompositeQuery: req.CompositeQuery.Clone(),
Variables: make(map[string]interface{}, 0),
NoCache: false,
}
updateStepInterval(currentGrowthQuery)
var pastGrowthPeriodStart, pastGrowthPeriodEnd int64
switch seasonality {
case SeasonalityWeekly:
pastGrowthPeriodStart = start - 2*oneWeekOffset
pastGrowthPeriodEnd = start - 1*oneWeekOffset
case SeasonalityDaily:
pastGrowthPeriodStart = start - 2*oneDayOffset
pastGrowthPeriodEnd = start - 1*oneDayOffset
case SeasonalityHourly:
pastGrowthPeriodStart = start - 2*oneHourOffset
pastGrowthPeriodEnd = start - 1*oneHourOffset
}
pastGrowthQuery := &v3.QueryRangeParamsV3{
Start: pastGrowthPeriodStart,
End: pastGrowthPeriodEnd,
CompositeQuery: req.CompositeQuery.Clone(),
Variables: make(map[string]interface{}, 0),
NoCache: false,
}
updateStepInterval(pastGrowthQuery)
var past2GrowthPeriodStart, past2GrowthPeriodEnd int64
switch seasonality {
case SeasonalityWeekly:
past2GrowthPeriodStart = start - 3*oneWeekOffset
past2GrowthPeriodEnd = start - 2*oneWeekOffset
case SeasonalityDaily:
past2GrowthPeriodStart = start - 3*oneDayOffset
past2GrowthPeriodEnd = start - 2*oneDayOffset
case SeasonalityHourly:
past2GrowthPeriodStart = start - 3*oneHourOffset
past2GrowthPeriodEnd = start - 2*oneHourOffset
}
past2GrowthQuery := &v3.QueryRangeParamsV3{
Start: past2GrowthPeriodStart,
End: past2GrowthPeriodEnd,
CompositeQuery: req.CompositeQuery.Clone(),
Variables: make(map[string]interface{}, 0),
NoCache: false,
}
updateStepInterval(past2GrowthQuery)
var past3GrowthPeriodStart, past3GrowthPeriodEnd int64
switch seasonality {
case SeasonalityWeekly:
past3GrowthPeriodStart = start - 4*oneWeekOffset
past3GrowthPeriodEnd = start - 3*oneWeekOffset
case SeasonalityDaily:
past3GrowthPeriodStart = start - 4*oneDayOffset
past3GrowthPeriodEnd = start - 3*oneDayOffset
case SeasonalityHourly:
past3GrowthPeriodStart = start - 4*oneHourOffset
past3GrowthPeriodEnd = start - 3*oneHourOffset
}
past3GrowthQuery := &v3.QueryRangeParamsV3{
Start: past3GrowthPeriodStart,
End: past3GrowthPeriodEnd,
CompositeQuery: req.CompositeQuery.Clone(),
Variables: make(map[string]interface{}, 0),
NoCache: false,
}
updateStepInterval(past3GrowthQuery)
return &anomalyQueryParams{
CurrentPeriodQuery: currentPeriodQuery,
PastPeriodQuery: pastPeriodQuery,
CurrentSeasonQuery: currentGrowthQuery,
PastSeasonQuery: pastGrowthQuery,
Past2SeasonQuery: past2GrowthQuery,
Past3SeasonQuery: past3GrowthQuery,
}
}
type anomalyQueryResults struct {
CurrentPeriodResults []*v3.Result
PastPeriodResults []*v3.Result
CurrentSeasonResults []*v3.Result
PastSeasonResults []*v3.Result
Past2SeasonResults []*v3.Result
Past3SeasonResults []*v3.Result
}
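To make the windowing concrete, here is a minimal standalone sketch (not part of this package) of the six ranges `prepareAnomalyQueryParams` builds for `SeasonalityDaily`, mirroring the switch statements above. Note that in the code only the past period window carries the 5-minute offset; the season windows are offset by whole seasons.

```go
package main

import (
	"fmt"
	"time"
)

type window struct {
	name       string
	start, end int64 // Unix milliseconds
}

func main() {
	// A 5-minute evaluation window ending now.
	end := time.Now().UnixMilli()
	start := end - (5 * time.Minute).Milliseconds()

	day := (24 * time.Hour).Milliseconds()
	fiveMin := (5 * time.Minute).Milliseconds()

	// The six windows prepareAnomalyQueryParams builds for SeasonalityDaily.
	windows := []window{
		{"current_period", start, end},
		{"past_period", start - day - fiveMin, end - day},
		{"current_season", start - day, end},
		{"past_season", start - 2*day, start - day},
		{"past2_season", start - 3*day, start - 2*day},
		{"past3_season", start - 4*day, start - 3*day},
	}
	for _, w := range windows {
		fmt.Printf("%-15s [%d, %d]\n", w.name, w.start, w.end)
	}
}
```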

View File

@ -0,0 +1,9 @@
package anomaly
import (
"context"
)
type Provider interface {
GetAnomalies(ctx context.Context, req *GetAnomaliesRequest) (*GetAnomaliesResponse, error)
}

View File

@ -0,0 +1,466 @@
package anomaly
import (
"context"
"math"
"time"
"go.signoz.io/signoz/pkg/query-service/cache"
"go.signoz.io/signoz/pkg/query-service/interfaces"
v3 "go.signoz.io/signoz/pkg/query-service/model/v3"
"go.signoz.io/signoz/pkg/query-service/postprocess"
"go.signoz.io/signoz/pkg/query-service/utils/labels"
"go.uber.org/zap"
)
var (
// TODO(srikanthccv): make this configurable?
movingAvgWindowSize = 7
)
// BaseProvider is an interface that includes common methods for all provider types
type BaseProvider interface {
GetBaseSeasonalProvider() *BaseSeasonalProvider
}
// GenericProviderOption is a generic type for provider options
type GenericProviderOption[T BaseProvider] func(T)
func WithCache[T BaseProvider](cache cache.Cache) GenericProviderOption[T] {
return func(p T) {
p.GetBaseSeasonalProvider().cache = cache
}
}
func WithKeyGenerator[T BaseProvider](keyGenerator cache.KeyGenerator) GenericProviderOption[T] {
return func(p T) {
p.GetBaseSeasonalProvider().keyGenerator = keyGenerator
}
}
func WithFeatureLookup[T BaseProvider](ff interfaces.FeatureLookup) GenericProviderOption[T] {
return func(p T) {
p.GetBaseSeasonalProvider().ff = ff
}
}
func WithReader[T BaseProvider](reader interfaces.Reader) GenericProviderOption[T] {
return func(p T) {
p.GetBaseSeasonalProvider().reader = reader
}
}
type BaseSeasonalProvider struct {
querierV2 interfaces.Querier
reader interfaces.Reader
fluxInterval time.Duration
cache cache.Cache
keyGenerator cache.KeyGenerator
ff interfaces.FeatureLookup
}
func (p *BaseSeasonalProvider) getQueryParams(req *GetAnomaliesRequest) *anomalyQueryParams {
if !req.Seasonality.IsValid() {
req.Seasonality = SeasonalityDaily
}
return prepareAnomalyQueryParams(req.Params, req.Seasonality)
}
func (p *BaseSeasonalProvider) getResults(ctx context.Context, params *anomalyQueryParams) (*anomalyQueryResults, error) {
zap.L().Info("fetching results for current period", zap.Any("currentPeriodQuery", params.CurrentPeriodQuery))
currentPeriodResults, _, err := p.querierV2.QueryRange(ctx, params.CurrentPeriodQuery)
if err != nil {
return nil, err
}
currentPeriodResults, err = postprocess.PostProcessResult(currentPeriodResults, params.CurrentPeriodQuery)
if err != nil {
return nil, err
}
zap.L().Info("fetching results for past period", zap.Any("pastPeriodQuery", params.PastPeriodQuery))
pastPeriodResults, _, err := p.querierV2.QueryRange(ctx, params.PastPeriodQuery)
if err != nil {
return nil, err
}
pastPeriodResults, err = postprocess.PostProcessResult(pastPeriodResults, params.PastPeriodQuery)
if err != nil {
return nil, err
}
zap.L().Info("fetching results for current season", zap.Any("currentSeasonQuery", params.CurrentSeasonQuery))
currentSeasonResults, _, err := p.querierV2.QueryRange(ctx, params.CurrentSeasonQuery)
if err != nil {
return nil, err
}
currentSeasonResults, err = postprocess.PostProcessResult(currentSeasonResults, params.CurrentSeasonQuery)
if err != nil {
return nil, err
}
zap.L().Info("fetching results for past season", zap.Any("pastSeasonQuery", params.PastSeasonQuery))
pastSeasonResults, _, err := p.querierV2.QueryRange(ctx, params.PastSeasonQuery)
if err != nil {
return nil, err
}
pastSeasonResults, err = postprocess.PostProcessResult(pastSeasonResults, params.PastSeasonQuery)
if err != nil {
return nil, err
}
zap.L().Info("fetching results for past 2 season", zap.Any("past2SeasonQuery", params.Past2SeasonQuery))
past2SeasonResults, _, err := p.querierV2.QueryRange(ctx, params.Past2SeasonQuery)
if err != nil {
return nil, err
}
past2SeasonResults, err = postprocess.PostProcessResult(past2SeasonResults, params.Past2SeasonQuery)
if err != nil {
return nil, err
}
zap.L().Info("fetching results for past 3 season", zap.Any("past3SeasonQuery", params.Past3SeasonQuery))
past3SeasonResults, _, err := p.querierV2.QueryRange(ctx, params.Past3SeasonQuery)
if err != nil {
return nil, err
}
past3SeasonResults, err = postprocess.PostProcessResult(past3SeasonResults, params.Past3SeasonQuery)
if err != nil {
return nil, err
}
return &anomalyQueryResults{
CurrentPeriodResults: currentPeriodResults,
PastPeriodResults: pastPeriodResults,
CurrentSeasonResults: currentSeasonResults,
PastSeasonResults: pastSeasonResults,
Past2SeasonResults: past2SeasonResults,
Past3SeasonResults: past3SeasonResults,
}, nil
}
// getMatchingSeries gets the matching series from the query result
// for the given series
func (p *BaseSeasonalProvider) getMatchingSeries(queryResult *v3.Result, series *v3.Series) *v3.Series {
if queryResult == nil || len(queryResult.Series) == 0 {
return nil
}
for _, curr := range queryResult.Series {
currLabels := labels.FromMap(curr.Labels)
seriesLabels := labels.FromMap(series.Labels)
if currLabels.Hash() == seriesLabels.Hash() {
return curr
}
}
return nil
}
func (p *BaseSeasonalProvider) getAvg(series *v3.Series) float64 {
if series == nil || len(series.Points) == 0 {
return 0
}
var sum float64
for _, smpl := range series.Points {
sum += smpl.Value
}
return sum / float64(len(series.Points))
}
func (p *BaseSeasonalProvider) getStdDev(series *v3.Series) float64 {
if series == nil || len(series.Points) == 0 {
return 0
}
avg := p.getAvg(series)
var sum float64
for _, smpl := range series.Points {
sum += math.Pow(smpl.Value-avg, 2)
}
return math.Sqrt(sum / float64(len(series.Points)))
}
// getMovingAvg gets the moving average for the given series
// for the given window size and start index
func (p *BaseSeasonalProvider) getMovingAvg(series *v3.Series, movingAvgWindowSize, startIdx int) float64 {
if series == nil || len(series.Points) == 0 {
return 0
}
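// Clamp startIdx so a full window fits; points near the series tail reuse the final window's average.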
if startIdx >= len(series.Points)-movingAvgWindowSize {
startIdx = int(math.Max(0, float64(len(series.Points)-movingAvgWindowSize)))
}
var sum float64
points := series.Points[startIdx:]
for i := 0; i < movingAvgWindowSize && i < len(points); i++ {
sum += points[i].Value
}
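// Note: the divisor is always movingAvgWindowSize, even if fewer points were summed.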
avg := sum / float64(movingAvgWindowSize)
return avg
}
func (p *BaseSeasonalProvider) getMean(floats ...float64) float64 {
if len(floats) == 0 {
return 0
}
var sum float64
for _, f := range floats {
sum += f
}
return sum / float64(len(floats))
}
func (p *BaseSeasonalProvider) getPredictedSeries(
series, prevSeries, currentSeasonSeries, pastSeasonSeries, past2SeasonSeries, past3SeasonSeries *v3.Series,
) *v3.Series {
predictedSeries := &v3.Series{
Labels: series.Labels,
LabelsArray: series.LabelsArray,
Points: []v3.Point{},
}
// for each point in the series, get the predicted value
// the predicted value is the moving average (with window size = 7) of the previous period series
// plus the average of the current season series
// minus the mean of the past season series, past2 season series and past3 season series
for idx, curr := range series.Points {
predictedValue :=
p.getMovingAvg(prevSeries, movingAvgWindowSize, idx) +
p.getAvg(currentSeasonSeries) -
p.getMean(p.getAvg(pastSeasonSeries), p.getAvg(past2SeasonSeries), p.getAvg(past3SeasonSeries))
if predictedValue < 0 {
predictedValue = p.getMovingAvg(prevSeries, movingAvgWindowSize, idx)
}
zap.L().Info("predictedSeries",
zap.Float64("movingAvg", p.getMovingAvg(prevSeries, movingAvgWindowSize, idx)),
zap.Float64("avg", p.getAvg(currentSeasonSeries)),
zap.Float64("mean", p.getMean(p.getAvg(pastSeasonSeries), p.getAvg(past2SeasonSeries), p.getAvg(past3SeasonSeries))),
zap.Any("labels", series.Labels),
zap.Float64("predictedValue", predictedValue),
)
predictedSeries.Points = append(predictedSeries.Points, v3.Point{
Timestamp: curr.Timestamp,
Value: predictedValue,
})
}
return predictedSeries
}
// getBounds gets the upper and lower bounds for the given series
// for the given z score threshold
// moving avg of the previous period series + z score threshold * std dev of the series
// moving avg of the previous period series - z score threshold * std dev of the series
func (p *BaseSeasonalProvider) getBounds(
series, predictedSeries *v3.Series,
zScoreThreshold float64,
) (*v3.Series, *v3.Series) {
upperBoundSeries := &v3.Series{
Labels: series.Labels,
LabelsArray: series.LabelsArray,
Points: []v3.Point{},
}
lowerBoundSeries := &v3.Series{
Labels: series.Labels,
LabelsArray: series.LabelsArray,
Points: []v3.Point{},
}
for idx, curr := range series.Points {
upperBound := p.getMovingAvg(predictedSeries, movingAvgWindowSize, idx) + zScoreThreshold*p.getStdDev(series)
lowerBound := p.getMovingAvg(predictedSeries, movingAvgWindowSize, idx) - zScoreThreshold*p.getStdDev(series)
upperBoundSeries.Points = append(upperBoundSeries.Points, v3.Point{
Timestamp: curr.Timestamp,
Value: upperBound,
})
lowerBoundSeries.Points = append(lowerBoundSeries.Points, v3.Point{
Timestamp: curr.Timestamp,
Value: math.Max(lowerBound, 0),
})
}
return upperBoundSeries, lowerBoundSeries
}
// getExpectedValue gets the expected value for the given series
// for the given index
// prevSeriesAvg + currentSeasonSeriesAvg - mean of past season series, past2 season series and past3 season series
func (p *BaseSeasonalProvider) getExpectedValue(
_, prevSeries, currentSeasonSeries, pastSeasonSeries, past2SeasonSeries, past3SeasonSeries *v3.Series, idx int,
) float64 {
prevSeriesAvg := p.getMovingAvg(prevSeries, movingAvgWindowSize, idx)
currentSeasonSeriesAvg := p.getAvg(currentSeasonSeries)
pastSeasonSeriesAvg := p.getAvg(pastSeasonSeries)
past2SeasonSeriesAvg := p.getAvg(past2SeasonSeries)
past3SeasonSeriesAvg := p.getAvg(past3SeasonSeries)
return prevSeriesAvg + currentSeasonSeriesAvg - p.getMean(pastSeasonSeriesAvg, past2SeasonSeriesAvg, past3SeasonSeriesAvg)
}
// getScore gets the anomaly score for the given series
// for the given index
// (value - expectedValue) / std dev of the current season series
func (p *BaseSeasonalProvider) getScore(
series, prevSeries, weekSeries, weekPrevSeries, past2SeasonSeries, past3SeasonSeries *v3.Series, value float64, idx int,
) float64 {
expectedValue := p.getExpectedValue(series, prevSeries, weekSeries, weekPrevSeries, past2SeasonSeries, past3SeasonSeries, idx)
return (value - expectedValue) / p.getStdDev(weekSeries)
}
// getAnomalyScores builds the anomaly score series for the given series:
// for each point, (value - expectedValue) / std dev of the current season series
func (p *BaseSeasonalProvider) getAnomalyScores(
series, prevSeries, currentSeasonSeries, pastSeasonSeries, past2SeasonSeries, past3SeasonSeries *v3.Series,
) *v3.Series {
anomalyScoreSeries := &v3.Series{
Labels: series.Labels,
LabelsArray: series.LabelsArray,
Points: []v3.Point{},
}
for idx, curr := range series.Points {
anomalyScore := p.getScore(series, prevSeries, currentSeasonSeries, pastSeasonSeries, past2SeasonSeries, past3SeasonSeries, curr.Value, idx)
anomalyScoreSeries.Points = append(anomalyScoreSeries.Points, v3.Point{
Timestamp: curr.Timestamp,
Value: anomalyScore,
})
}
return anomalyScoreSeries
}
func (p *BaseSeasonalProvider) getAnomalies(ctx context.Context, req *GetAnomaliesRequest) (*GetAnomaliesResponse, error) {
anomalyParams := p.getQueryParams(req)
anomalyQueryResults, err := p.getResults(ctx, anomalyParams)
if err != nil {
return nil, err
}
currentPeriodResultsMap := make(map[string]*v3.Result)
for _, result := range anomalyQueryResults.CurrentPeriodResults {
currentPeriodResultsMap[result.QueryName] = result
}
pastPeriodResultsMap := make(map[string]*v3.Result)
for _, result := range anomalyQueryResults.PastPeriodResults {
pastPeriodResultsMap[result.QueryName] = result
}
currentSeasonResultsMap := make(map[string]*v3.Result)
for _, result := range anomalyQueryResults.CurrentSeasonResults {
currentSeasonResultsMap[result.QueryName] = result
}
pastSeasonResultsMap := make(map[string]*v3.Result)
for _, result := range anomalyQueryResults.PastSeasonResults {
pastSeasonResultsMap[result.QueryName] = result
}
past2SeasonResultsMap := make(map[string]*v3.Result)
for _, result := range anomalyQueryResults.Past2SeasonResults {
past2SeasonResultsMap[result.QueryName] = result
}
past3SeasonResultsMap := make(map[string]*v3.Result)
for _, result := range anomalyQueryResults.Past3SeasonResults {
past3SeasonResultsMap[result.QueryName] = result
}
for _, result := range currentPeriodResultsMap {
funcs := req.Params.CompositeQuery.BuilderQueries[result.QueryName].Functions
var zScoreThreshold float64
for _, f := range funcs {
if f.Name == v3.FunctionNameAnomaly {
value, ok := f.NamedArgs["z_score_threshold"]
if ok {
zScoreThreshold = value.(float64)
} else {
zScoreThreshold = 3
}
break
}
}
pastPeriodResult, ok := pastPeriodResultsMap[result.QueryName]
if !ok {
continue
}
currentSeasonResult, ok := currentSeasonResultsMap[result.QueryName]
if !ok {
continue
}
pastSeasonResult, ok := pastSeasonResultsMap[result.QueryName]
if !ok {
continue
}
past2SeasonResult, ok := past2SeasonResultsMap[result.QueryName]
if !ok {
continue
}
past3SeasonResult, ok := past3SeasonResultsMap[result.QueryName]
if !ok {
continue
}
for _, series := range result.Series {
stdDev := p.getStdDev(series)
zap.L().Info("stdDev", zap.Float64("stdDev", stdDev), zap.Any("labels", series.Labels))
pastPeriodSeries := p.getMatchingSeries(pastPeriodResult, series)
currentSeasonSeries := p.getMatchingSeries(currentSeasonResult, series)
pastSeasonSeries := p.getMatchingSeries(pastSeasonResult, series)
past2SeasonSeries := p.getMatchingSeries(past2SeasonResult, series)
past3SeasonSeries := p.getMatchingSeries(past3SeasonResult, series)
prevSeriesAvg := p.getAvg(pastPeriodSeries)
currentSeasonSeriesAvg := p.getAvg(currentSeasonSeries)
pastSeasonSeriesAvg := p.getAvg(pastSeasonSeries)
past2SeasonSeriesAvg := p.getAvg(past2SeasonSeries)
past3SeasonSeriesAvg := p.getAvg(past3SeasonSeries)
zap.L().Info("getAvg", zap.Float64("prevSeriesAvg", prevSeriesAvg), zap.Float64("currentSeasonSeriesAvg", currentSeasonSeriesAvg), zap.Float64("pastSeasonSeriesAvg", pastSeasonSeriesAvg), zap.Float64("past2SeasonSeriesAvg", past2SeasonSeriesAvg), zap.Float64("past3SeasonSeriesAvg", past3SeasonSeriesAvg), zap.Any("labels", series.Labels))
predictedSeries := p.getPredictedSeries(
series,
pastPeriodSeries,
currentSeasonSeries,
pastSeasonSeries,
past2SeasonSeries,
past3SeasonSeries,
)
result.PredictedSeries = append(result.PredictedSeries, predictedSeries)
upperBoundSeries, lowerBoundSeries := p.getBounds(
series,
predictedSeries,
zScoreThreshold,
)
result.UpperBoundSeries = append(result.UpperBoundSeries, upperBoundSeries)
result.LowerBoundSeries = append(result.LowerBoundSeries, lowerBoundSeries)
anomalyScoreSeries := p.getAnomalyScores(
series,
pastPeriodSeries,
currentSeasonSeries,
pastSeasonSeries,
past2SeasonSeries,
past3SeasonSeries,
)
result.AnomalyScores = append(result.AnomalyScores, anomalyScoreSeries)
}
}
results := make([]*v3.Result, 0, len(currentPeriodResultsMap))
for _, result := range currentPeriodResultsMap {
results = append(results, result)
}
return &GetAnomaliesResponse{
Results: results,
}, nil
}
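To sanity-check the arithmetic above, here is a minimal standalone sketch with hypothetical numbers. In the real code every input is derived from the query results, and `getBounds` applies a moving average over the predicted series; this sketch uses the point prediction directly for brevity.

```go
package main

import (
	"fmt"
	"math"
)

// mean returns the arithmetic mean of the given values.
func mean(vals ...float64) float64 {
	var sum float64
	for _, v := range vals {
		sum += v
	}
	return sum / float64(len(vals))
}

func main() {
	// Hypothetical per-window averages for one series.
	prevAvg := 100.0          // moving avg of the past period series
	currentSeasonAvg := 110.0 // avg of the current season series
	pastAvgs := []float64{95, 90, 85}

	// prediction = prevAvg + currentSeasonAvg - mean(past season avgs)
	prediction := prevAvg + currentSeasonAvg - mean(pastAvgs...) // 120

	// score = (value - prediction) / stddev(current season series);
	// the doc comment uses abs(), while getScore keeps the sign.
	value, stdDev := 140.0, 12.0
	score := (value - prediction) / stdDev // 1.67

	// bounds: prediction +/- zScoreThreshold*stdDev, lower clamped at 0.
	z := 3.0
	upper := prediction + z*stdDev          // 156
	lower := math.Max(prediction-z*stdDev, 0) // 84

	fmt.Printf("prediction=%.1f score=%.2f bounds=[%.1f, %.1f]\n",
		prediction, score, lower, upper)
}
```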

View File

@ -0,0 +1,43 @@
package anomaly
import (
"context"
querierV2 "go.signoz.io/signoz/pkg/query-service/app/querier/v2"
"go.signoz.io/signoz/pkg/query-service/app/queryBuilder"
)
type WeeklyProvider struct {
BaseSeasonalProvider
}
var _ BaseProvider = (*WeeklyProvider)(nil)
func (wp *WeeklyProvider) GetBaseSeasonalProvider() *BaseSeasonalProvider {
return &wp.BaseSeasonalProvider
}
func NewWeeklyProvider(opts ...GenericProviderOption[*WeeklyProvider]) *WeeklyProvider {
wp := &WeeklyProvider{
BaseSeasonalProvider: BaseSeasonalProvider{},
}
for _, opt := range opts {
opt(wp)
}
wp.querierV2 = querierV2.NewQuerier(querierV2.QuerierOptions{
Reader: wp.reader,
Cache: wp.cache,
KeyGenerator: queryBuilder.NewKeyGenerator(),
FluxInterval: wp.fluxInterval,
FeatureLookup: wp.ff,
})
return wp
}
func (p *WeeklyProvider) GetAnomalies(ctx context.Context, req *GetAnomaliesRequest) (*GetAnomaliesResponse, error) {
req.Seasonality = SeasonalityWeekly
return p.getAnomalies(ctx, req)
}

View File

@ -38,7 +38,8 @@ type APIHandlerOptions struct {
Cache cache.Cache
Gateway *httputil.ReverseProxy
// Querier Flux Interval
FluxInterval time.Duration
FluxInterval time.Duration
UseLogsNewSchema bool
}
type APIHandler struct {
@ -63,6 +64,7 @@ func NewAPIHandler(opts APIHandlerOptions) (*APIHandler, error) {
LogsParsingPipelineController: opts.LogsParsingPipelineController,
Cache: opts.Cache,
FluxInterval: opts.FluxInterval,
UseLogsNewSchema: opts.UseLogsNewSchema,
})
if err != nil {
@ -175,6 +177,8 @@ func (ah *APIHandler) RegisterRoutes(router *mux.Router, am *baseapp.AuthMiddlew
am.ViewAccess(ah.listLicensesV2)).
Methods(http.MethodGet)
router.HandleFunc("/api/v4/query_range", am.ViewAccess(ah.queryRangeV4)).Methods(http.MethodPost)
// Gateway
router.PathPrefix(gateway.RoutePrefix).HandlerFunc(am.AdminAccess(ah.ServeGatewayHTTP))

View File

@ -0,0 +1,119 @@
package api
import (
"bytes"
"fmt"
"io"
"net/http"
"go.signoz.io/signoz/ee/query-service/anomaly"
baseapp "go.signoz.io/signoz/pkg/query-service/app"
"go.signoz.io/signoz/pkg/query-service/app/queryBuilder"
"go.signoz.io/signoz/pkg/query-service/model"
v3 "go.signoz.io/signoz/pkg/query-service/model/v3"
"go.uber.org/zap"
)
func (aH *APIHandler) queryRangeV4(w http.ResponseWriter, r *http.Request) {
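// Buffer the request body so it can be replayed for the base handler if no anomaly query is found.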
bodyBytes, _ := io.ReadAll(r.Body)
r.Body = io.NopCloser(bytes.NewBuffer(bodyBytes))
queryRangeParams, apiErrorObj := baseapp.ParseQueryRangeParams(r)
if apiErrorObj != nil {
zap.L().Error("error parsing metric query range params", zap.Error(apiErrorObj.Err))
RespondError(w, apiErrorObj, nil)
return
}
queryRangeParams.Version = "v4"
// add temporality for each metric
temporalityErr := aH.PopulateTemporality(r.Context(), queryRangeParams)
if temporalityErr != nil {
zap.L().Error("Error while adding temporality for metrics", zap.Error(temporalityErr))
RespondError(w, &model.ApiError{Typ: model.ErrorInternal, Err: temporalityErr}, nil)
return
}
anomalyQueryExists := false
anomalyQuery := &v3.BuilderQuery{}
if queryRangeParams.CompositeQuery.QueryType == v3.QueryTypeBuilder {
for _, query := range queryRangeParams.CompositeQuery.BuilderQueries {
for _, fn := range query.Functions {
if fn.Name == v3.FunctionNameAnomaly {
anomalyQueryExists = true
anomalyQuery = query
break
}
}
}
}
if anomalyQueryExists {
// ensure all queries have metric data source, and there should be only one anomaly query
for _, query := range queryRangeParams.CompositeQuery.BuilderQueries {
if query.DataSource != v3.DataSourceMetrics {
RespondError(w, &model.ApiError{Typ: model.ErrorBadData, Err: fmt.Errorf("all queries must have metric data source")}, nil)
return
}
}
// get the seasonality from the anomaly query (the z-score threshold is read later in getAnomalies)
var seasonality anomaly.Seasonality
for _, fn := range anomalyQuery.Functions {
if fn.Name == v3.FunctionNameAnomaly {
seasonalityStr, ok := fn.NamedArgs["seasonality"].(string)
if !ok {
seasonalityStr = "daily"
}
if seasonalityStr == "weekly" {
seasonality = anomaly.SeasonalityWeekly
} else if seasonalityStr == "daily" {
seasonality = anomaly.SeasonalityDaily
} else {
seasonality = anomaly.SeasonalityHourly
}
break
}
}
var provider anomaly.Provider
switch seasonality {
case anomaly.SeasonalityWeekly:
provider = anomaly.NewWeeklyProvider(
anomaly.WithCache[*anomaly.WeeklyProvider](aH.opts.Cache),
anomaly.WithKeyGenerator[*anomaly.WeeklyProvider](queryBuilder.NewKeyGenerator()),
anomaly.WithReader[*anomaly.WeeklyProvider](aH.opts.DataConnector),
anomaly.WithFeatureLookup[*anomaly.WeeklyProvider](aH.opts.FeatureFlags),
)
case anomaly.SeasonalityDaily:
provider = anomaly.NewDailyProvider(
anomaly.WithCache[*anomaly.DailyProvider](aH.opts.Cache),
anomaly.WithKeyGenerator[*anomaly.DailyProvider](queryBuilder.NewKeyGenerator()),
anomaly.WithReader[*anomaly.DailyProvider](aH.opts.DataConnector),
anomaly.WithFeatureLookup[*anomaly.DailyProvider](aH.opts.FeatureFlags),
)
case anomaly.SeasonalityHourly:
provider = anomaly.NewHourlyProvider(
anomaly.WithCache[*anomaly.HourlyProvider](aH.opts.Cache),
anomaly.WithKeyGenerator[*anomaly.HourlyProvider](queryBuilder.NewKeyGenerator()),
anomaly.WithReader[*anomaly.HourlyProvider](aH.opts.DataConnector),
anomaly.WithFeatureLookup[*anomaly.HourlyProvider](aH.opts.FeatureFlags),
)
}
anomalies, err := provider.GetAnomalies(r.Context(), &anomaly.GetAnomaliesRequest{Params: queryRangeParams})
if err != nil {
RespondError(w, &model.ApiError{Typ: model.ErrorInternal, Err: err}, nil)
return
}
uniqueResults := make(map[string]*v3.Result)
for _, anomaly := range anomalies.Results {
uniqueResults[anomaly.QueryName] = anomaly
uniqueResults[anomaly.QueryName].IsAnomaly = true
}
aH.Respond(w, uniqueResults)
} else {
r.Body = io.NopCloser(bytes.NewBuffer(bodyBytes))
aH.QueryRangeV4(w, r)
}
}
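For reference, here is a sketch of a builder query that would take the anomaly branch above. The `v3.Function` literal is an assumption inferred from the field accesses in this handler (`fn.Name`, `fn.NamedArgs`); only the behavior of the named args is confirmed by the code.

```go
package example

import (
	v3 "go.signoz.io/signoz/pkg/query-service/model/v3"
)

// anomalyQueryExample returns a hypothetical query that would take the
// anomaly branch in queryRangeV4 above.
func anomalyQueryExample() *v3.BuilderQuery {
	return &v3.BuilderQuery{
		DataSource: v3.DataSourceMetrics, // anomaly queries must use metrics
		Functions: []v3.Function{{
			Name: v3.FunctionNameAnomaly,
			NamedArgs: map[string]interface{}{
				"seasonality":       "weekly", // "daily" is the default; unknown values fall back to hourly
				"z_score_threshold": 2.5,      // getAnomalies defaults this to 3 when omitted
			},
		}},
	}
}
```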

View File

@ -1,401 +0,0 @@
package db
import (
"context"
"crypto/md5"
"encoding/json"
"fmt"
"reflect"
"regexp"
"sort"
"strings"
"time"
"go.signoz.io/signoz/ee/query-service/model"
baseconst "go.signoz.io/signoz/pkg/query-service/constants"
basemodel "go.signoz.io/signoz/pkg/query-service/model"
"go.signoz.io/signoz/pkg/query-service/utils"
"go.uber.org/zap"
)
// GetMetricResultEE runs the query and returns list of time series
func (r *ClickhouseReader) GetMetricResultEE(ctx context.Context, query string) ([]*basemodel.Series, string, error) {
defer utils.Elapsed("GetMetricResult", nil)()
zap.L().Info("Executing metric result query: ", zap.String("query", query))
var hash string
// If getSubTreeSpans function is used in the clickhouse query
if strings.Contains(query, "getSubTreeSpans(") {
var err error
query, hash, err = r.getSubTreeSpansCustomFunction(ctx, query, hash)
if err != nil && err.Error() == "no spans found for the given query" {
return nil, "", nil
}
if err != nil {
return nil, "", err
}
}
rows, err := r.conn.Query(ctx, query)
if err != nil {
zap.L().Error("Error in processing query", zap.Error(err))
return nil, "", fmt.Errorf("error in processing query")
}
var (
columnTypes = rows.ColumnTypes()
columnNames = rows.Columns()
vars = make([]interface{}, len(columnTypes))
)
for i := range columnTypes {
vars[i] = reflect.New(columnTypes[i].ScanType()).Interface()
}
// when group by is applied, each combination of the cartesian product
// of attributes is a separate series. each item in metricPointsMap
// represents a unique series.
metricPointsMap := make(map[string][]basemodel.MetricPoint)
// attribute key-value pairs for each group selection
attributesMap := make(map[string]map[string]string)
defer rows.Close()
for rows.Next() {
if err := rows.Scan(vars...); err != nil {
return nil, "", err
}
var groupBy []string
var metricPoint basemodel.MetricPoint
groupAttributes := make(map[string]string)
// Assuming that the end result row contains a timestamp, value and optional labels
// Label key and value are both strings.
for idx, v := range vars {
colName := columnNames[idx]
switch v := v.(type) {
case *string:
// special case for returning all labels
if colName == "fullLabels" {
var metric map[string]string
err := json.Unmarshal([]byte(*v), &metric)
if err != nil {
return nil, "", err
}
for key, val := range metric {
groupBy = append(groupBy, val)
groupAttributes[key] = val
}
} else {
groupBy = append(groupBy, *v)
groupAttributes[colName] = *v
}
case *time.Time:
metricPoint.Timestamp = v.UnixMilli()
case *float64:
metricPoint.Value = *v
case **float64:
// ch seems to return this type when column is derived from
// SELECT count(*)/ SELECT count(*)
floatVal := *v
if floatVal != nil {
metricPoint.Value = *floatVal
}
case *float32:
float32Val := float32(*v)
metricPoint.Value = float64(float32Val)
case *uint8, *uint64, *uint16, *uint32:
if _, ok := baseconst.ReservedColumnTargetAliases[colName]; ok {
metricPoint.Value = float64(reflect.ValueOf(v).Elem().Uint())
} else {
groupBy = append(groupBy, fmt.Sprintf("%v", reflect.ValueOf(v).Elem().Uint()))
groupAttributes[colName] = fmt.Sprintf("%v", reflect.ValueOf(v).Elem().Uint())
}
case *int8, *int16, *int32, *int64:
if _, ok := baseconst.ReservedColumnTargetAliases[colName]; ok {
metricPoint.Value = float64(reflect.ValueOf(v).Elem().Int())
} else {
groupBy = append(groupBy, fmt.Sprintf("%v", reflect.ValueOf(v).Elem().Int()))
groupAttributes[colName] = fmt.Sprintf("%v", reflect.ValueOf(v).Elem().Int())
}
default:
zap.L().Error("invalid var found in metric builder query result", zap.Any("var", v), zap.String("colName", colName))
}
}
sort.Strings(groupBy)
key := strings.Join(groupBy, "")
attributesMap[key] = groupAttributes
metricPointsMap[key] = append(metricPointsMap[key], metricPoint)
}
var seriesList []*basemodel.Series
for key := range metricPointsMap {
points := metricPointsMap[key]
// first point in each series could be invalid since the
// aggregations are applied with a point from the previous series
if len(points) > 1 {
points = points[1:]
}
attributes := attributesMap[key]
series := basemodel.Series{Labels: attributes, Points: points}
seriesList = append(seriesList, &series)
}
// err = r.conn.Exec(ctx, "DROP TEMPORARY TABLE IF EXISTS getSubTreeSpans"+hash)
// if err != nil {
// zap.L().Error("Error in dropping temporary table: ", err)
// return nil, err
// }
if hash == "" {
return seriesList, hash, nil
} else {
return seriesList, "getSubTreeSpans" + hash, nil
}
}
func (r *ClickhouseReader) getSubTreeSpansCustomFunction(ctx context.Context, query string, hash string) (string, string, error) {
zap.L().Debug("Executing getSubTreeSpans function")
// str1 := `select fromUnixTimestamp64Milli(intDiv( toUnixTimestamp64Milli ( timestamp ), 100) * 100) AS interval, toFloat64(count()) as count from (select timestamp, spanId, parentSpanId, durationNano from getSubTreeSpans(select * from signoz_traces.signoz_index_v2 where serviceName='frontend' and name='/driver.DriverService/FindNearest' and traceID='00000000000000004b0a863cb5ed7681') where name='FindDriverIDs' group by interval order by interval asc;`
// process the query to fetch subTree query
var subtreeInput string
query, subtreeInput, hash = processQuery(query, hash)
err := r.conn.Exec(ctx, "DROP TABLE IF EXISTS getSubTreeSpans"+hash)
if err != nil {
zap.L().Error("Error in dropping temporary table", zap.Error(err))
return query, hash, err
}
// Create temporary table to store the getSubTreeSpans() results
zap.L().Debug("Creating temporary table getSubTreeSpans", zap.String("hash", hash))
err = r.conn.Exec(ctx, "CREATE TABLE IF NOT EXISTS "+"getSubTreeSpans"+hash+" (timestamp DateTime64(9) CODEC(DoubleDelta, LZ4), traceID FixedString(32) CODEC(ZSTD(1)), spanID String CODEC(ZSTD(1)), parentSpanID String CODEC(ZSTD(1)), rootSpanID String CODEC(ZSTD(1)), serviceName LowCardinality(String) CODEC(ZSTD(1)), name LowCardinality(String) CODEC(ZSTD(1)), rootName LowCardinality(String) CODEC(ZSTD(1)), durationNano UInt64 CODEC(T64, ZSTD(1)), kind Int8 CODEC(T64, ZSTD(1)), tagMap Map(LowCardinality(String), String) CODEC(ZSTD(1)), events Array(String) CODEC(ZSTD(2))) ENGINE = MergeTree() ORDER BY (timestamp)")
if err != nil {
zap.L().Error("Error in creating temporary table", zap.Error(err))
return query, hash, err
}
var getSpansSubQueryDBResponses []model.GetSpansSubQueryDBResponse
getSpansSubQuery := subtreeInput
// Execute the subTree query
zap.L().Debug("Executing subTree query", zap.String("query", getSpansSubQuery))
err = r.conn.Select(ctx, &getSpansSubQueryDBResponses, getSpansSubQuery)
// zap.L().Info(getSpansSubQuery)
if err != nil {
zap.L().Error("Error in processing sql query", zap.Error(err))
return query, hash, fmt.Errorf("error in processing sql query")
}
var searchScanResponses []basemodel.SearchSpanDBResponseItem
// TODO : @ankit: I think the algorithm does not need to assume that subtrees are from the same TraceID. We can take this as an improvement later.
// Fetch all the spans of the same TraceID so that we can build the subtree
modelQuery := fmt.Sprintf("SELECT timestamp, traceID, model FROM %s.%s WHERE traceID=$1", r.TraceDB, r.SpansTable)
if len(getSpansSubQueryDBResponses) == 0 {
return query, hash, fmt.Errorf("no spans found for the given query")
}
zap.L().Debug("Executing query to fetch all the spans from the same TraceID: ", zap.String("modelQuery", modelQuery))
err = r.conn.Select(ctx, &searchScanResponses, modelQuery, getSpansSubQueryDBResponses[0].TraceID)
if err != nil {
zap.L().Error("Error in processing sql query", zap.Error(err))
return query, hash, fmt.Errorf("error in processing sql query")
}
// Process model to fetch the spans
zap.L().Debug("Processing model to fetch the spans")
searchSpanResponses := []basemodel.SearchSpanResponseItem{}
for _, item := range searchScanResponses {
var jsonItem basemodel.SearchSpanResponseItem
json.Unmarshal([]byte(item.Model), &jsonItem)
jsonItem.TimeUnixNano = uint64(item.Timestamp.UnixNano())
if jsonItem.Events == nil {
jsonItem.Events = []string{}
}
searchSpanResponses = append(searchSpanResponses, jsonItem)
}
// Build the subtree and store all the subtree spans in temporary table getSubTreeSpans+hash
// Use map to store pointer to the spans to avoid duplicates and save memory
zap.L().Debug("Building the subtree to store all the subtree spans in temporary table getSubTreeSpans", zap.String("hash", hash))
treeSearchResponse, err := getSubTreeAlgorithm(searchSpanResponses, getSpansSubQueryDBResponses)
if err != nil {
zap.L().Error("Error in getSubTreeAlgorithm function", zap.Error(err))
return query, hash, err
}
zap.L().Debug("Preparing batch to store subtree spans in temporary table getSubTreeSpans", zap.String("hash", hash))
statement, err := r.conn.PrepareBatch(context.Background(), fmt.Sprintf("INSERT INTO getSubTreeSpans"+hash))
if err != nil {
zap.L().Error("Error in preparing batch statement", zap.Error(err))
return query, hash, err
}
for _, span := range treeSearchResponse {
var parentID string
if len(span.References) > 0 && span.References[0].RefType == "CHILD_OF" {
parentID = span.References[0].SpanId
}
err = statement.Append(
time.Unix(0, int64(span.TimeUnixNano)),
span.TraceID,
span.SpanID,
parentID,
span.RootSpanID,
span.ServiceName,
span.Name,
span.RootName,
uint64(span.DurationNano),
int8(span.Kind),
span.TagMap,
span.Events,
)
if err != nil {
zap.L().Error("Error in processing sql query", zap.Error(err))
return query, hash, err
}
}
zap.L().Debug("Inserting the subtree spans in temporary table getSubTreeSpans", zap.String("hash", hash))
err = statement.Send()
if err != nil {
zap.L().Error("Error in sending statement", zap.Error(err))
return query, hash, err
}
return query, hash, nil
}
//lint:ignore SA4009 return hash is fed to the query
func processQuery(query string, hash string) (string, string, string) {
re3 := regexp.MustCompile(`getSubTreeSpans`)
submatchall3 := re3.FindAllStringIndex(query, -1)
getSubtreeSpansMatchIndex := submatchall3[0][1]
query2countParenthesis := query[getSubtreeSpansMatchIndex:]
sqlCompleteIndex := 0
countParenthesisImbalance := 0
for i, char := range query2countParenthesis {
if string(char) == "(" {
countParenthesisImbalance += 1
}
if string(char) == ")" {
countParenthesisImbalance -= 1
}
if countParenthesisImbalance == 0 {
sqlCompleteIndex = i
break
}
}
subtreeInput := query2countParenthesis[1:sqlCompleteIndex]
// hash the subtreeInput
hmd5 := md5.Sum([]byte(subtreeInput))
hash = fmt.Sprintf("%x", hmd5)
// Rewrite the query to read from the temporary table getSubTreeSpans<hash>
query = query[:getSubtreeSpansMatchIndex] + hash + " " + query2countParenthesis[sqlCompleteIndex+1:]
return query, subtreeInput, hash
}
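
To make the rewrite concrete, here is a minimal standalone sketch of the same parenthesis-balancing extraction on a hypothetical query (the production code locates the marker with a regexp; strings.Index is enough for the sketch):

package main

import (
	"crypto/md5"
	"fmt"
	"strings"
)

func main() {
	query := "SELECT count() FROM getSubTreeSpans(SELECT spanID FROM signoz_index WHERE name='GET /api') GROUP BY serviceName"

	// Index just past the getSubTreeSpans marker, i.e. at the opening '('.
	idx := strings.Index(query, "getSubTreeSpans") + len("getSubTreeSpans")

	// Walk forward until the parentheses balance out again.
	balance, end := 0, idx
	for i := idx; i < len(query); i++ {
		switch query[i] {
		case '(':
			balance++
		case ')':
			balance--
		}
		if balance == 0 {
			end = i
			break
		}
	}

	subtreeInput := query[idx+1 : end] // the inner SELECT, without the outer parentheses
	hash := fmt.Sprintf("%x", md5.Sum([]byte(subtreeInput)))

	// The call becomes a read from the temporary table getSubTreeSpans<hash>.
	rewritten := query[:idx] + hash + " " + query[end+1:]
	fmt.Println(subtreeInput)
	fmt.Println(rewritten)
}
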
// getSubTreeAlgorithm builds the subtree rooted at each span returned by the getSubTreeSpans input query and returns the subtree spans keyed by span ID
func getSubTreeAlgorithm(payload []basemodel.SearchSpanResponseItem, getSpansSubQueryDBResponses []model.GetSpansSubQueryDBResponse) (map[string]*basemodel.SearchSpanResponseItem, error) {
var spans []*model.SpanForTraceDetails
for _, spanItem := range payload {
var parentID string
if len(spanItem.References) > 0 && spanItem.References[0].RefType == "CHILD_OF" {
parentID = spanItem.References[0].SpanId
}
span := &model.SpanForTraceDetails{
TimeUnixNano: spanItem.TimeUnixNano,
SpanID: spanItem.SpanID,
TraceID: spanItem.TraceID,
ServiceName: spanItem.ServiceName,
Name: spanItem.Name,
Kind: spanItem.Kind,
DurationNano: spanItem.DurationNano,
TagMap: spanItem.TagMap,
ParentID: parentID,
Events: spanItem.Events,
HasError: spanItem.HasError,
}
spans = append(spans, span)
}
zap.L().Debug("Building Tree")
roots, err := buildSpanTrees(&spans)
if err != nil {
return nil, err
}
searchSpansResult := make(map[string]*basemodel.SearchSpanResponseItem)
// Every span fetched by the getSubTreeSpans input SQL query is considered a root
// For each root, get the subtree spans
for _, getSpansSubQueryDBResponse := range getSpansSubQueryDBResponses {
targetSpan := &model.SpanForTraceDetails{}
// zap.L().Debug("Building tree for span id: " + getSpansSubQueryDBResponse.SpanID + " " + strconv.Itoa(i+1) + " of " + strconv.Itoa(len(getSpansSubQueryDBResponses)))
// Search target span object in the tree
for _, root := range roots {
targetSpan, err = breadthFirstSearch(root, getSpansSubQueryDBResponse.SpanID)
if targetSpan != nil {
break
}
if err != nil {
zap.L().Error("Error during BreadthFirstSearch()", zap.Error(err))
return nil, err
}
}
if targetSpan == nil {
return nil, nil
}
// Build subtree for the target span
// Mark the target span as root by setting parent ID as empty string
targetSpan.ParentID = ""
preParents := []*model.SpanForTraceDetails{targetSpan}
children := []*model.SpanForTraceDetails{}
// Get the subtree child spans
for len(preParents) != 0 {
parents := []*model.SpanForTraceDetails{}
for _, parent := range preParents {
children = append(children, parent.Children...)
parents = append(parents, parent.Children...)
}
preParents = parents
}
resultSpans := children
// Add the target span to the result spans
resultSpans = append(resultSpans, targetSpan)
for _, item := range resultSpans {
references := []basemodel.OtelSpanRef{
{
TraceId: item.TraceID,
SpanId: item.ParentID,
RefType: "CHILD_OF",
},
}
if item.Events == nil {
item.Events = []string{}
}
searchSpansResult[item.SpanID] = &basemodel.SearchSpanResponseItem{
TimeUnixNano: item.TimeUnixNano,
SpanID: item.SpanID,
TraceID: item.TraceID,
ServiceName: item.ServiceName,
Name: item.Name,
Kind: item.Kind,
References: references,
DurationNano: item.DurationNano,
TagMap: item.TagMap,
Events: item.Events,
HasError: item.HasError,
RootSpanID: getSpansSubQueryDBResponse.SpanID,
RootName: targetSpan.Name,
}
}
}
return searchSpansResult, nil
}
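
The level-by-level descendant collection above is easiest to follow on a toy tree. A minimal sketch, with Span standing in for model.SpanForTraceDetails:

package main

import "fmt"

type Span struct {
	ID       string
	Children []*Span
}

// collectSubtree gathers every descendant of target, one level per pass,
// then appends the target itself, mirroring the loop in getSubTreeAlgorithm.
func collectSubtree(target *Span) []*Span {
	preParents := []*Span{target}
	var children []*Span
	for len(preParents) != 0 {
		var parents []*Span
		for _, p := range preParents {
			children = append(children, p.Children...)
			parents = append(parents, p.Children...)
		}
		preParents = parents
	}
	return append(children, target)
}

func main() {
	leaf := &Span{ID: "c"}
	mid := &Span{ID: "b", Children: []*Span{leaf}}
	root := &Span{ID: "a", Children: []*Span{mid}}
	for _, s := range collectSubtree(root) {
		fmt.Println(s.ID) // prints b, c, a
	}
}
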

View File

@ -25,8 +25,9 @@ func NewDataConnector(
maxOpenConns int,
dialTimeout time.Duration,
cluster string,
useLogsNewSchema bool,
) *ClickhouseReader {
ch := basechr.NewReader(localDB, promConfigPath, lm, maxIdleConns, maxOpenConns, dialTimeout, cluster)
ch := basechr.NewReader(localDB, promConfigPath, lm, maxIdleConns, maxOpenConns, dialTimeout, cluster, useLogsNewSchema)
return &ClickhouseReader{
conn: ch.GetConn(),
appdb: localDB,

View File

@ -77,6 +77,7 @@ type ServerOptions struct {
FluxInterval string
Cluster string
GatewayUrl string
UseLogsNewSchema bool
}
// Server runs HTTP api service
@ -154,6 +155,7 @@ func NewServer(serverOptions *ServerOptions) (*Server, error) {
serverOptions.MaxOpenConns,
serverOptions.DialTimeout,
serverOptions.Cluster,
serverOptions.UseLogsNewSchema,
)
go qb.Start(readerReady)
reader = qb
@ -168,6 +170,14 @@ func NewServer(serverOptions *ServerOptions) (*Server, error) {
return nil, err
}
}
var c cache.Cache
if serverOptions.CacheConfigPath != "" {
cacheOpts, err := cache.LoadFromYAMLCacheConfigFile(serverOptions.CacheConfigPath)
if err != nil {
return nil, err
}
c = cache.NewCache(cacheOpts)
}
<-readerReady
rm, err := makeRulesManager(serverOptions.PromConfigPath,
@ -175,8 +185,11 @@ func NewServer(serverOptions *ServerOptions) (*Server, error) {
serverOptions.RuleRepoURL,
localDB,
reader,
c,
serverOptions.DisableRules,
lm)
lm,
serverOptions.UseLogsNewSchema,
)
if err != nil {
return nil, err
@ -233,15 +246,6 @@ func NewServer(serverOptions *ServerOptions) (*Server, error) {
telemetry.GetInstance().SetReader(reader)
telemetry.GetInstance().SetSaasOperator(constants.SaasSegmentKey)
var c cache.Cache
if serverOptions.CacheConfigPath != "" {
cacheOpts, err := cache.LoadFromYAMLCacheConfigFile(serverOptions.CacheConfigPath)
if err != nil {
return nil, err
}
c = cache.NewCache(cacheOpts)
}
fluxInterval, err := time.ParseDuration(serverOptions.FluxInterval)
if err != nil {
@ -265,6 +269,7 @@ func NewServer(serverOptions *ServerOptions) (*Server, error) {
Cache: c,
FluxInterval: fluxInterval,
Gateway: gatewayProxy,
UseLogsNewSchema: serverOptions.UseLogsNewSchema,
}
apiHandler, err := api.NewAPIHandler(apiOpts)
@ -727,8 +732,10 @@ func makeRulesManager(
ruleRepoURL string,
db *sqlx.DB,
ch baseint.Reader,
cache cache.Cache,
disableRules bool,
fm baseint.FeatureLookup) (*baserules.Manager, error) {
fm baseint.FeatureLookup,
useLogsNewSchema bool) (*baserules.Manager, error) {
// create engine
pqle, err := pqle.FromConfigPath(promConfigPath)
@ -754,9 +761,11 @@ func makeRulesManager(
DisableRules: disableRules,
FeatureFlags: fm,
Reader: ch,
Cache: cache,
EvalDelay: baseconst.GetEvalDelay(),
PrepareTaskFunc: rules.PrepareTaskFunc,
PrepareTaskFunc: rules.PrepareTaskFunc,
UseLogsNewSchema: useLogsNewSchema,
}
// create Manager

View File

@ -87,6 +87,7 @@ func main() {
var ruleRepoURL string
var cluster string
var useLogsNewSchema bool
var cacheConfigPath, fluxInterval string
var enableQueryServiceLogOTLPExport bool
var preferSpanMetrics bool
@ -96,6 +97,7 @@ func main() {
var dialTimeout time.Duration
var gatewayUrl string
flag.BoolVar(&useLogsNewSchema, "use-logs-new-schema", false, "use logs_v2 schema for logs")
flag.StringVar(&promConfigPath, "config", "./config/prometheus.yml", "(prometheus config to read metrics)")
flag.StringVar(&skipTopLvlOpsPath, "skip-top-level-ops", "", "(config file to skip top level operations)")
flag.BoolVar(&disableRules, "rules.disable", false, "(disable rule evaluation)")
@ -134,6 +136,7 @@ func main() {
FluxInterval: fluxInterval,
Cluster: cluster,
GatewayUrl: gatewayUrl,
UseLogsNewSchema: useLogsNewSchema,
}
// Read the jwt secret key

View File

@ -13,7 +13,6 @@ const Onboarding = "ONBOARDING"
const ChatSupport = "CHAT_SUPPORT"
const Gateway = "GATEWAY"
const PremiumSupport = "PREMIUM_SUPPORT"
const QueryBuilderSearchV2 = "QUERY_BUILDER_SEARCH_V2"
var BasicPlan = basemodel.FeatureSet{
basemodel.Feature{
@ -129,7 +128,7 @@ var BasicPlan = basemodel.FeatureSet{
Route: "",
},
basemodel.Feature{
Name: QueryBuilderSearchV2,
Name: basemodel.AnomalyDetection,
Active: false,
Usage: 0,
UsageLimit: -1,
@ -244,8 +243,8 @@ var ProPlan = basemodel.FeatureSet{
Route: "",
},
basemodel.Feature{
Name: QueryBuilderSearchV2,
Active: false,
Name: basemodel.AnomalyDetection,
Active: true,
Usage: 0,
UsageLimit: -1,
Route: "",
@ -373,8 +372,8 @@ var EnterprisePlan = basemodel.FeatureSet{
Route: "",
},
basemodel.Feature{
Name: QueryBuilderSearchV2,
Active: false,
Name: basemodel.AnomalyDetection,
Active: true,
Usage: 0,
UsageLimit: -1,
Route: "",

View File

@ -0,0 +1,393 @@
package rules
import (
"context"
"encoding/json"
"fmt"
"math"
"strings"
"sync"
"time"
"go.uber.org/zap"
"go.signoz.io/signoz/ee/query-service/anomaly"
"go.signoz.io/signoz/pkg/query-service/cache"
"go.signoz.io/signoz/pkg/query-service/common"
"go.signoz.io/signoz/pkg/query-service/model"
querierV2 "go.signoz.io/signoz/pkg/query-service/app/querier/v2"
"go.signoz.io/signoz/pkg/query-service/app/queryBuilder"
"go.signoz.io/signoz/pkg/query-service/interfaces"
v3 "go.signoz.io/signoz/pkg/query-service/model/v3"
"go.signoz.io/signoz/pkg/query-service/utils/labels"
"go.signoz.io/signoz/pkg/query-service/utils/times"
"go.signoz.io/signoz/pkg/query-service/utils/timestamp"
"go.signoz.io/signoz/pkg/query-service/formatter"
baserules "go.signoz.io/signoz/pkg/query-service/rules"
yaml "gopkg.in/yaml.v2"
)
const (
RuleTypeAnomaly = "anomaly_rule"
)
type AnomalyRule struct {
*baserules.BaseRule
mtx sync.Mutex
reader interfaces.Reader
// querierV2 is used for alerts created after the introduction of the new metrics query builder
querierV2 interfaces.Querier
provider anomaly.Provider
seasonality anomaly.Seasonality
}
func NewAnomalyRule(
id string,
p *baserules.PostableRule,
featureFlags interfaces.FeatureLookup,
reader interfaces.Reader,
cache cache.Cache,
opts ...baserules.RuleOption,
) (*AnomalyRule, error) {
zap.L().Info("creating new AnomalyRule", zap.String("id", id), zap.Any("opts", opts))
baseRule, err := baserules.NewBaseRule(id, p, reader, opts...)
if err != nil {
return nil, err
}
t := AnomalyRule{
BaseRule: baseRule,
}
switch strings.ToLower(p.RuleCondition.Seasonality) {
case "hourly":
t.seasonality = anomaly.SeasonalityHourly
case "daily":
t.seasonality = anomaly.SeasonalityDaily
case "weekly":
t.seasonality = anomaly.SeasonalityWeekly
default:
t.seasonality = anomaly.SeasonalityDaily
}
zap.L().Info("using seasonality", zap.String("seasonality", t.seasonality.String()))
querierOptsV2 := querierV2.QuerierOptions{
Reader: reader,
Cache: cache,
KeyGenerator: queryBuilder.NewKeyGenerator(),
FeatureLookup: featureFlags,
}
t.querierV2 = querierV2.NewQuerier(querierOptsV2)
t.reader = reader
if t.seasonality == anomaly.SeasonalityHourly {
t.provider = anomaly.NewHourlyProvider(
anomaly.WithCache[*anomaly.HourlyProvider](cache),
anomaly.WithKeyGenerator[*anomaly.HourlyProvider](queryBuilder.NewKeyGenerator()),
anomaly.WithReader[*anomaly.HourlyProvider](reader),
anomaly.WithFeatureLookup[*anomaly.HourlyProvider](featureFlags),
)
} else if t.seasonality == anomaly.SeasonalityDaily {
t.provider = anomaly.NewDailyProvider(
anomaly.WithCache[*anomaly.DailyProvider](cache),
anomaly.WithKeyGenerator[*anomaly.DailyProvider](queryBuilder.NewKeyGenerator()),
anomaly.WithReader[*anomaly.DailyProvider](reader),
anomaly.WithFeatureLookup[*anomaly.DailyProvider](featureFlags),
)
} else if t.seasonality == anomaly.SeasonalityWeekly {
t.provider = anomaly.NewWeeklyProvider(
anomaly.WithCache[*anomaly.WeeklyProvider](cache),
anomaly.WithKeyGenerator[*anomaly.WeeklyProvider](queryBuilder.NewKeyGenerator()),
anomaly.WithReader[*anomaly.WeeklyProvider](reader),
anomaly.WithFeatureLookup[*anomaly.WeeklyProvider](featureFlags),
)
}
return &t, nil
}
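
The anomaly.With*[...] calls above are a generics-based functional-options pattern. A toy sketch of that pattern, with all names illustrative:

package main

import "fmt"

type provider interface{ setCache(string) }

type hourlyProvider struct{ cache string }

func (h *hourlyProvider) setCache(c string) { h.cache = c }

// option is a typed functional option; withCache produces one for any provider.
type option[T provider] func(T)

func withCache[T provider](c string) option[T] {
	return func(p T) { p.setCache(c) }
}

func newHourlyProvider(opts ...option[*hourlyProvider]) *hourlyProvider {
	p := &hourlyProvider{}
	for _, o := range opts {
		o(p)
	}
	return p
}

func main() {
	p := newHourlyProvider(withCache[*hourlyProvider]("in-memory"))
	fmt.Println(p.cache) // in-memory
}
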
func (r *AnomalyRule) Type() baserules.RuleType {
return RuleTypeAnomaly
}
func (r *AnomalyRule) prepareQueryRange(ts time.Time) (*v3.QueryRangeParamsV3, error) {
zap.L().Info("prepareQueryRange", zap.Int64("ts", ts.UnixMilli()), zap.Int64("evalWindow", r.EvalWindow().Milliseconds()), zap.Int64("evalDelay", r.EvalDelay().Milliseconds()))
start := ts.Add(-time.Duration(r.EvalWindow())).UnixMilli()
end := ts.UnixMilli()
if r.EvalDelay() > 0 {
start = start - int64(r.EvalDelay().Milliseconds())
end = end - int64(r.EvalDelay().Milliseconds())
}
// round to minute otherwise we could potentially miss data
start = start - (start % (60 * 1000))
end = end - (end % (60 * 1000))
compositeQuery := r.Condition().CompositeQuery
if compositeQuery.PanelType != v3.PanelTypeGraph {
compositeQuery.PanelType = v3.PanelTypeGraph
}
// default mode
return &v3.QueryRangeParamsV3{
Start: start,
End: end,
Step: int64(math.Max(float64(common.MinAllowedStepInterval(start, end)), 60)),
CompositeQuery: compositeQuery,
Variables: make(map[string]interface{}, 0),
NoCache: false,
}, nil
}
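
A small sketch of the window arithmetic in prepareQueryRange: both bounds are shifted back by the eval delay, then floored to the minute so an edge datapoint is not missed. Timestamps and durations are illustrative:

package main

import (
	"fmt"
	"time"
)

func main() {
	ts := time.Date(2024, 9, 25, 10, 7, 43, 0, time.UTC)
	evalWindow := 5 * time.Minute
	evalDelay := 2 * time.Minute

	start := ts.Add(-evalWindow).UnixMilli()
	end := ts.UnixMilli()

	// Shift both bounds back so late-arriving data is included.
	start -= evalDelay.Milliseconds()
	end -= evalDelay.Milliseconds()

	// Floor to the minute, as in prepareQueryRange.
	start -= start % (60 * 1000)
	end -= end % (60 * 1000)

	fmt.Println(time.UnixMilli(start).UTC()) // 2024-09-25 10:00:00 +0000 UTC
	fmt.Println(time.UnixMilli(end).UTC())   // 2024-09-25 10:05:00 +0000 UTC
}
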
func (r *AnomalyRule) GetSelectedQuery() string {
return r.Condition().GetSelectedQueryName()
}
func (r *AnomalyRule) buildAndRunQuery(ctx context.Context, ts time.Time) (baserules.Vector, error) {
params, err := r.prepareQueryRange(ts)
if err != nil {
return nil, err
}
err = r.PopulateTemporality(ctx, params)
if err != nil {
return nil, fmt.Errorf("internal error while setting temporality")
}
anomalies, err := r.provider.GetAnomalies(ctx, &anomaly.GetAnomaliesRequest{
Params: params,
Seasonality: r.seasonality,
})
if err != nil {
return nil, err
}
var queryResult *v3.Result
for _, result := range anomalies.Results {
if result.QueryName == r.GetSelectedQuery() {
queryResult = result
break
}
}
if queryResult == nil {
return nil, fmt.Errorf("no result found for the selected query: %s", r.GetSelectedQuery())
}
var resultVector baserules.Vector
scoresJSON, _ := json.Marshal(queryResult.AnomalyScores)
zap.L().Info("anomaly scores", zap.String("scores", string(scoresJSON)))
for _, series := range queryResult.AnomalyScores {
smpl, shouldAlert := r.ShouldAlert(*series)
if shouldAlert {
resultVector = append(resultVector, smpl)
}
}
return resultVector, nil
}
func (r *AnomalyRule) Eval(ctx context.Context, ts time.Time) (interface{}, error) {
prevState := r.State()
valueFormatter := formatter.FromUnit(r.Unit())
res, err := r.buildAndRunQuery(ctx, ts)
if err != nil {
return nil, err
}
r.mtx.Lock()
defer r.mtx.Unlock()
resultFPs := map[uint64]struct{}{}
var alerts = make(map[uint64]*baserules.Alert, len(res))
for _, smpl := range res {
l := make(map[string]string, len(smpl.Metric))
for _, lbl := range smpl.Metric {
l[lbl.Name] = lbl.Value
}
value := valueFormatter.Format(smpl.V, r.Unit())
threshold := valueFormatter.Format(r.TargetVal(), r.Unit())
zap.L().Debug("Alert template data for rule", zap.String("name", r.Name()), zap.String("formatter", valueFormatter.Name()), zap.String("value", value), zap.String("threshold", threshold))
tmplData := baserules.AlertTemplateData(l, value, threshold)
// Inject some convenience variables that are easier to remember for users
// who are not used to Go's templating system.
defs := "{{$labels := .Labels}}{{$value := .Value}}{{$threshold := .Threshold}}"
// utility function to apply go template on labels and annotations
expand := func(text string) string {
tmpl := baserules.NewTemplateExpander(
ctx,
defs+text,
"__alert_"+r.Name(),
tmplData,
times.Time(timestamp.FromTime(ts)),
nil,
)
result, err := tmpl.Expand()
if err != nil {
result = fmt.Sprintf("<error expanding template: %s>", err)
zap.L().Error("Expanding alert template failed", zap.Error(err), zap.Any("data", tmplData))
}
return result
}
lb := labels.NewBuilder(smpl.Metric).Del(labels.MetricNameLabel).Del(labels.TemporalityLabel)
resultLabels := labels.NewBuilder(smpl.MetricOrig).Del(labels.MetricNameLabel).Del(labels.TemporalityLabel).Labels()
for name, value := range r.Labels().Map() {
lb.Set(name, expand(value))
}
lb.Set(labels.AlertNameLabel, r.Name())
lb.Set(labels.AlertRuleIdLabel, r.ID())
lb.Set(labels.RuleSourceLabel, r.GeneratorURL())
annotations := make(labels.Labels, 0, len(r.Annotations().Map()))
for name, value := range r.Annotations().Map() {
annotations = append(annotations, labels.Label{Name: common.NormalizeLabelName(name), Value: expand(value)})
}
if smpl.IsMissing {
lb.Set(labels.AlertNameLabel, "[No data] "+r.Name())
}
lbs := lb.Labels()
h := lbs.Hash()
resultFPs[h] = struct{}{}
if _, ok := alerts[h]; ok {
zap.L().Error("the alert query returns duplicate records", zap.String("ruleid", r.ID()), zap.Any("alert", alerts[h]))
err = fmt.Errorf("duplicate alert found, vector contains metrics with the same labelset after applying alert labels")
return nil, err
}
alerts[h] = &baserules.Alert{
Labels: lbs,
QueryResultLables: resultLabels,
Annotations: annotations,
ActiveAt: ts,
State: model.StatePending,
Value: smpl.V,
GeneratorURL: r.GeneratorURL(),
Receivers: r.PreferredChannels(),
Missing: smpl.IsMissing,
}
}
zap.L().Info("number of alerts found", zap.String("name", r.Name()), zap.Int("count", len(alerts)))
// alerts[h] is ready, add or update active list now
for h, a := range alerts {
// Check whether we already have alerting state for the identifying label set.
// Update the last value and annotations if so, create a new alert entry otherwise.
if alert, ok := r.Active[h]; ok && alert.State != model.StateInactive {
alert.Value = a.Value
alert.Annotations = a.Annotations
alert.Receivers = r.PreferredChannels()
continue
}
r.Active[h] = a
}
itemsToAdd := []model.RuleStateHistory{}
// Check if any pending alerts should be removed or fire now. Write out alert timeseries.
for fp, a := range r.Active {
labelsJSON, err := json.Marshal(a.QueryResultLables)
if err != nil {
zap.L().Error("error marshaling labels", zap.Error(err), zap.Any("labels", a.Labels))
}
if _, ok := resultFPs[fp]; !ok {
// If the alert was previously firing, keep it around for a given
// retention time so it is reported as resolved to the AlertManager.
if a.State == model.StatePending || (!a.ResolvedAt.IsZero() && ts.Sub(a.ResolvedAt) > baserules.ResolvedRetention) {
delete(r.Active, fp)
}
if a.State != model.StateInactive {
a.State = model.StateInactive
a.ResolvedAt = ts
itemsToAdd = append(itemsToAdd, model.RuleStateHistory{
RuleID: r.ID(),
RuleName: r.Name(),
State: model.StateInactive,
StateChanged: true,
UnixMilli: ts.UnixMilli(),
Labels: model.LabelsString(labelsJSON),
Fingerprint: a.QueryResultLables.Hash(),
Value: a.Value,
})
}
continue
}
if a.State == model.StatePending && ts.Sub(a.ActiveAt) >= r.HoldDuration() {
a.State = model.StateFiring
a.FiredAt = ts
state := model.StateFiring
if a.Missing {
state = model.StateNoData
}
itemsToAdd = append(itemsToAdd, model.RuleStateHistory{
RuleID: r.ID(),
RuleName: r.Name(),
State: state,
StateChanged: true,
UnixMilli: ts.UnixMilli(),
Labels: model.LabelsString(labelsJSON),
Fingerprint: a.QueryResultLables.Hash(),
Value: a.Value,
})
}
}
currentState := r.State()
overallStateChanged := currentState != prevState
for idx, item := range itemsToAdd {
item.OverallStateChanged = overallStateChanged
item.OverallState = currentState
itemsToAdd[idx] = item
}
r.RecordRuleStateHistory(ctx, prevState, currentState, itemsToAdd)
return len(r.Active), nil
}
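
The expand helper above works by prepending the defs block, so alert authors can write $labels, $value and $threshold directly. A minimal sketch with Go's text/template; the data shape is an assumption, not the exact AlertTemplateData struct:

package main

import (
	"bytes"
	"fmt"
	"text/template"
)

func main() {
	data := struct {
		Labels    map[string]string
		Value     string
		Threshold string
	}{
		Labels:    map[string]string{"service": "checkout"},
		Value:     "912 ms",
		Threshold: "800 ms",
	}

	// Convenience variables injected ahead of the user's annotation text.
	defs := "{{$labels := .Labels}}{{$value := .Value}}{{$threshold := .Threshold}}"
	userText := "p99 latency for {{$labels.service}} is {{$value}} (threshold {{$threshold}})"

	tmpl := template.Must(template.New("alert").Parse(defs + userText))
	var out bytes.Buffer
	if err := tmpl.Execute(&out, data); err != nil {
		fmt.Println("template error:", err)
		return
	}
	fmt.Println(out.String()) // p99 latency for checkout is 912 ms (threshold 800 ms)
}
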
func (r *AnomalyRule) String() string {
ar := baserules.PostableRule{
AlertName: r.Name(),
RuleCondition: r.Condition(),
EvalWindow: baserules.Duration(r.EvalWindow()),
Labels: r.Labels().Map(),
Annotations: r.Annotations().Map(),
PreferredChannels: r.PreferredChannels(),
}
byt, err := yaml.Marshal(ar)
if err != nil {
return fmt.Sprintf("error marshaling alerting rule: %s", err.Error())
}
return string(byt)
}
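
For readers new to the lifecycle handled in Eval, this toy sketch captures its two transitions: a pending alert fires once it has been active for at least the hold duration, and an alert that stops matching goes inactive (and is dropped after the resolved-retention window). Names and durations are illustrative:

package main

import (
	"fmt"
	"time"
)

type alertState string

const (
	statePending  alertState = "pending"
	stateFiring   alertState = "firing"
	stateInactive alertState = "inactive"
)

// step applies one evaluation tick, mirroring the branches in Eval.
func step(s alertState, activeFor, holdFor time.Duration, stillMatching bool) alertState {
	if !stillMatching {
		return stateInactive
	}
	if s == statePending && activeFor >= holdFor {
		return stateFiring
	}
	return s
}

func main() {
	s := statePending
	hold := 2 * time.Minute
	for minute := 1; minute <= 3; minute++ {
		s = step(s, time.Duration(minute)*time.Minute, hold, minute < 3)
		fmt.Printf("minute %d: %s\n", minute, s)
	}
	// minute 1: pending, minute 2: firing, minute 3: inactive
}
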

View File

@ -20,6 +20,7 @@ func PrepareTaskFunc(opts baserules.PrepareTaskOptions) (baserules.Task, error)
opts.Rule,
opts.FF,
opts.Reader,
opts.UseLogsNewSchema,
baserules.WithEvalDelay(opts.ManagerOpts.EvalDelay),
)
@ -52,6 +53,25 @@ func PrepareTaskFunc(opts baserules.PrepareTaskOptions) (baserules.Task, error)
// create promql rule task for evaluation
task = newTask(baserules.TaskTypeProm, opts.TaskName, time.Duration(opts.Rule.Frequency), rules, opts.ManagerOpts, opts.NotifyFunc, opts.RuleDB)
} else if opts.Rule.RuleType == baserules.RuleTypeAnomaly {
// create anomaly rule
ar, err := NewAnomalyRule(
ruleId,
opts.Rule,
opts.FF,
opts.Reader,
opts.Cache,
baserules.WithEvalDelay(opts.ManagerOpts.EvalDelay),
)
if err != nil {
return task, err
}
rules = append(rules, ar)
// create anomaly rule task for evaluation
task = newTask(baserules.TaskTypeCh, opts.TaskName, time.Duration(opts.Rule.Frequency), rules, opts.ManagerOpts, opts.NotifyFunc, opts.RuleDB)
} else {
return nil, fmt.Errorf("unsupported rule type. Supported types: %s, %s", baserules.RuleTypeProm, baserules.RuleTypeThreshold)
}

View File

@ -207,7 +207,6 @@
"eslint-plugin-sonarjs": "^0.12.0",
"husky": "^7.0.4",
"is-ci": "^3.0.1",
"jest-playwright-preset": "^1.7.2",
"jest-styled-components": "^7.0.8",
"lint-staged": "^12.5.0",
"msw": "1.3.2",

View File

@ -53,6 +53,7 @@
"option_atleastonce": "at least once",
"option_onaverage": "on average",
"option_intotal": "in total",
"option_last": "last",
"option_above": "above",
"option_below": "below",
"option_equal": "is equal to",

View File

@ -40,6 +40,7 @@
"option_atleastonce": "at least once",
"option_onaverage": "on average",
"option_intotal": "in total",
"option_last": "last",
"option_above": "above",
"option_below": "below",
"option_equal": "is equal to",

View File

@ -1,3 +1,3 @@
{
"rps_over_100": "You are sending data at more than 100 RPS, your ingestion may be rate limited. Please reach out to us via Intercom support."
"rps_over_100": "You are sending data at more than 100 RPS, your ingestion may be rate limited. Please reach out to us via Intercom support or "
}

View File

@ -53,6 +53,7 @@
"option_atleastonce": "at least once",
"option_onaverage": "on average",
"option_intotal": "in total",
"option_last": "last",
"option_above": "above",
"option_below": "below",
"option_equal": "is equal to",

View File

@ -40,6 +40,7 @@
"option_atleastonce": "at least once",
"option_onaverage": "on average",
"option_intotal": "in total",
"option_last": "last",
"option_above": "above",
"option_below": "below",
"option_equal": "is equal to",

View File

@ -1,3 +1,3 @@
{
"rps_over_100": "You are sending data at more than 100 RPS, your ingestion may be rate limited. Please reach out to us via Intercom support."
"rps_over_100": "You are sending data at more than 100 RPS, your ingestion may be rate limited. Please reach out to us via Intercom support or "
}

View File

@ -137,7 +137,6 @@ function App(): JSX.Element {
window.analytics.identify(email, sanitizedIdentifyPayload);
window.analytics.group(domain, groupTraits);
window.clarity('identify', email, name);
posthog?.identify(email, {
email,

View File

@ -12,6 +12,20 @@ beforeAll(() => {
matchMedia();
});
jest.mock('uplot', () => {
const paths = {
spline: jest.fn(),
bars: jest.fn(),
};
const uplotMock = jest.fn(() => ({
paths,
}));
return {
paths,
default: uplotMock,
};
});
jest.mock('react-dnd', () => ({
useDrop: jest.fn().mockImplementation(() => [jest.fn(), jest.fn(), jest.fn()]),
useDrag: jest.fn().mockImplementation(() => [jest.fn(), jest.fn(), jest.fn()]),

View File

@ -139,6 +139,7 @@ export const getGraphOptions = (
},
scales: {
x: {
stacked: isStacked,
grid: {
display: true,
color: getGridColor(),
@ -165,6 +166,7 @@ export const getGraphOptions = (
ticks: { color: getAxisLabelColor(currentTheme) },
},
y: {
stacked: isStacked,
display: true,
grid: {
display: true,
@ -178,9 +180,6 @@ export const getGraphOptions = (
},
},
},
stacked: {
display: isStacked === undefined ? false : 'auto',
},
},
elements: {
line: {

View File

@ -2,6 +2,14 @@ export const VIEW_TYPES = {
OVERVIEW: 'OVERVIEW',
JSON: 'JSON',
CONTEXT: 'CONTEXT',
INFRAMETRICS: 'INFRAMETRICS',
} as const;
export type VIEWS = typeof VIEW_TYPES[keyof typeof VIEW_TYPES];
export const RESOURCE_KEYS = {
CLUSTER_NAME: 'k8s.cluster.name',
POD_NAME: 'k8s.pod.name',
NODE_NAME: 'k8s.node.name',
HOST_NAME: 'host.name',
} as const;

View File

@ -9,6 +9,7 @@ import cx from 'classnames';
import { LogType } from 'components/Logs/LogStateIndicator/LogStateIndicator';
import { LOCALSTORAGE } from 'constants/localStorage';
import ContextView from 'container/LogDetailedView/ContextView/ContextView';
import InfraMetrics from 'container/LogDetailedView/InfraMetrics/InfraMetrics';
import JSONView from 'container/LogDetailedView/JsonView';
import Overview from 'container/LogDetailedView/Overview';
import {
@ -22,6 +23,7 @@ import { useQueryBuilder } from 'hooks/queryBuilder/useQueryBuilder';
import { useIsDarkMode } from 'hooks/useDarkMode';
import { useNotifications } from 'hooks/useNotifications';
import {
BarChart2,
Braces,
Copy,
Filter,
@ -36,7 +38,7 @@ import { Query, TagFilter } from 'types/api/queryBuilder/queryBuilderData';
import { DataSource, StringOperators } from 'types/common/queryBuilder';
import { FORBID_DOM_PURIFY_TAGS } from 'utils/app';
import { VIEW_TYPES, VIEWS } from './constants';
import { RESOURCE_KEYS, VIEW_TYPES, VIEWS } from './constants';
import { LogDetailProps } from './LogDetail.interfaces';
import QueryBuilderSearchWrapper from './QueryBuilderSearchWrapper';
@ -192,6 +194,17 @@ function LogDetail({
Context
</div>
</Radio.Button>
<Radio.Button
className={
selectedView === VIEW_TYPES.INFRAMETRICS ? 'selected_view tab' : 'tab'
}
value={VIEW_TYPES.INFRAMETRICS}
>
<div className="view-title">
<BarChart2 size={14} />
Metrics
</div>
</Radio.Button>
</Radio.Group>
{selectedView === VIEW_TYPES.JSON && (
@ -246,6 +259,15 @@ function LogDetail({
isEdit={isEdit}
/>
)}
{selectedView === VIEW_TYPES.INFRAMETRICS && (
<InfraMetrics
clusterName={log.resources_string?.[RESOURCE_KEYS.CLUSTER_NAME] || ''}
podName={log.resources_string?.[RESOURCE_KEYS.POD_NAME] || ''}
nodeName={log.resources_string?.[RESOURCE_KEYS.NODE_NAME] || ''}
hostName={log.resources_string?.[RESOURCE_KEYS.HOST_NAME] || ''}
logLineTimestamp={log.timestamp.toString()}
/>
)}
</Drawer>
);
}

View File

@ -15,7 +15,7 @@ function AddToQueryHOC({
}: AddToQueryHOCProps): JSX.Element {
const handleQueryAdd = (event: MouseEvent<HTMLDivElement>): void => {
event.stopPropagation();
onAddToQuery(fieldKey, fieldValue, OPERATORS.IN);
onAddToQuery(fieldKey, fieldValue, OPERATORS['=']);
};
const popOverContent = useMemo(() => <span>Add to query: {fieldKey}</span>, [

View File

@ -22,26 +22,21 @@
}
&.INFO {
background-color: var(--bg-slate-400);
background-color: var(--bg-robin-500);
}
&.WARNING,
&.WARN {
background-color: var(--bg-amber-500);
}
&.ERROR {
background-color: var(--bg-cherry-500);
}
&.TRACE {
background-color: var(--bg-robin-300);
background-color: var(--bg-forest-400);
}
&.DEBUG {
background-color: var(--bg-forest-500);
background-color: var(--bg-aqua-500);
}
&.FATAL {
background-color: var(--bg-sakura-500);
}

View File

@ -26,17 +26,14 @@ function HorizontalTimelineGraph({
return [[], []];
}
// add a first and last entry to make sure the graph displays all the data
const FIVE_MINUTES_IN_SECONDS = 300;
// add an entry for the end time of the last entry to make sure the graph displays all the data
const timestamps = [
data[0].start / 1000 - FIVE_MINUTES_IN_SECONDS, // 5 minutes before the first entry
...data.map((item) => item.start / 1000),
data[data.length - 1].end / 1000, // end value of last entry
];
const states = [
ALERT_STATUS[data[0].state], // Same state as the first entry
...data.map((item) => ALERT_STATUS[item.state]),
ALERT_STATUS[data[data.length - 1].state], // Same state as the last entry
];

View File

@ -50,6 +50,13 @@
align-items: center;
}
}
.billing-update-note {
text-align: left;
font-size: 13px;
color: var(--bg-vanilla-200);
margin-top: 16px;
}
}
.ant-skeleton.ant-skeleton-element.ant-skeleton-active {
@ -75,5 +82,9 @@
}
}
}
.billing-update-note {
color: var(--bg-ink-200);
}
}
}

View File

@ -348,7 +348,12 @@ export default function BillingContainer(): JSX.Element {
const BillingUsageGraphCallback = useCallback(
() =>
!isLoading && !isFetchingBillingData ? (
<BillingUsageGraph data={apiResponse} billAmount={billAmount} />
<>
<BillingUsageGraph data={apiResponse} billAmount={billAmount} />
<div className="billing-update-note">
Note: Billing metrics are updated once every 24 hours.
</div>
</>
) : (
<Card className="empty-graph-card" bordered={false}>
<Spinner size="large" tip="Loading..." height="35vh" />

View File

@ -65,7 +65,7 @@ export const logAlertDefaults: AlertDef = {
chQueries: {
A: {
name: 'A',
query: `select \ntoStartOfInterval(fromUnixTimestamp64Nano(timestamp), INTERVAL 30 MINUTE) AS interval, \ntoFloat64(count()) as value \nFROM signoz_logs.distributed_logs \nWHERE timestamp BETWEEN {{.start_timestamp_nano}} AND {{.end_timestamp_nano}} \nGROUP BY interval;\n\n-- available variables:\n-- \t{{.start_timestamp_nano}}\n-- \t{{.end_timestamp_nano}}\n\n-- required columns (or alias):\n-- \tvalue\n-- \tinterval`,
query: `select \ntoStartOfInterval(fromUnixTimestamp64Nano(timestamp), INTERVAL 30 MINUTE) AS interval, \ntoFloat64(count()) as value \nFROM signoz_logs.distributed_logs_v2 \nWHERE timestamp BETWEEN {{.start_timestamp_nano}} AND {{.end_timestamp_nano}} \nGROUP BY interval;\n\n-- available variables:\n-- \t{{.start_timestamp_nano}}\n-- \t{{.end_timestamp_nano}}\n\n-- required columns (or alias):\n-- \tvalue\n-- \tinterval`,
legend: '',
disabled: false,
},
@ -95,7 +95,7 @@ export const traceAlertDefaults: AlertDef = {
chQueries: {
A: {
name: 'A',
query: `SELECT \n\ttoStartOfInterval(timestamp, INTERVAL 1 MINUTE) AS interval, \n\ttagMap['peer.service'] AS op_name, \n\ttoFloat64(avg(durationNano)) AS value \nFROM signoz_traces.distributed_signoz_index_v2 \nWHERE tagMap['peer.service']!='' \nAND timestamp BETWEEN {{.start_datetime}} AND {{.end_datetime}} \nGROUP BY (op_name, interval);\n\n-- available variables:\n-- \t{{.start_datetime}}\n-- \t{{.end_datetime}}\n\n-- required column alias:\n-- \tvalue\n-- \tinterval`,
query: `SELECT \n\ttoStartOfInterval(timestamp, INTERVAL 1 MINUTE) AS interval, \n\tstringTagMap['peer.service'] AS op_name, \n\ttoFloat64(avg(durationNano)) AS value \nFROM signoz_traces.distributed_signoz_index_v2 \nWHERE stringTagMap['peer.service']!='' \nAND timestamp BETWEEN {{.start_datetime}} AND {{.end_datetime}} \nGROUP BY (op_name, interval);\n\n-- available variables:\n-- \t{{.start_datetime}}\n-- \t{{.end_datetime}}\n\n-- required column alias:\n-- \tvalue\n-- \tinterval`,
legend: '',
disabled: false,
},

View File

@ -260,7 +260,7 @@ function ChartPreview({
</FailedMessageContainer>
)}
{chartData && !queryResponse.isError && (
{chartData && !queryResponse.isError && !queryResponse.isLoading && (
<GridPanelSwitch
options={options}
panelType={graphType}

View File

@ -103,6 +103,7 @@ function RuleOptions({
<Select.Option value="2">{t('option_allthetimes')}</Select.Option>
<Select.Option value="3">{t('option_onaverage')}</Select.Option>
<Select.Option value="4">{t('option_intotal')}</Select.Option>
<Select.Option value="5">{t('option_last')}</Select.Option>
</InlineSelect>
);

View File

@ -370,7 +370,10 @@ function FormAlertRules({
});
// invalidate rule in cache
ruleCache.invalidateQueries([REACT_QUERY_KEY.ALERT_RULE_DETAILS, ruleId]);
ruleCache.invalidateQueries([
REACT_QUERY_KEY.ALERT_RULE_DETAILS,
`${ruleId}`,
]);
// eslint-disable-next-line sonarjs/no-identical-functions
setTimeout(() => {

View File

@ -15,6 +15,13 @@
box-sizing: border-box;
margin: 16px 0;
border-radius: 3px;
.global-search {
.ant-input-group-addon {
border: none;
background-color: var(--bg-ink-300);
}
}
}
.height-widget {
@ -55,3 +62,15 @@
}
}
}
.lightMode {
.full-view-container {
.graph-container {
.global-search {
.ant-input-group-addon {
background-color: var(--bg-vanilla-200);
}
}
}
}
}

View File

@ -1,7 +1,11 @@
import './WidgetFullView.styles.scss';
import { LoadingOutlined, SyncOutlined } from '@ant-design/icons';
import { Button, Spin } from 'antd';
import {
LoadingOutlined,
SearchOutlined,
SyncOutlined,
} from '@ant-design/icons';
import { Button, Input, Spin } from 'antd';
import cx from 'classnames';
import { ToggleGraphProps } from 'components/Graph/types';
import Spinner from 'components/Spinner';
@ -140,7 +144,7 @@ function FullView({
const [graphsVisibilityStates, setGraphsVisibilityStates] = useState<
boolean[]
>(Array(response.data?.payload.data.result.length).fill(true));
>(Array(response.data?.payload?.data?.result?.length).fill(true));
useEffect(() => {
const {
@ -172,6 +176,10 @@ function FullView({
const isListView = widget.panelTypes === PANEL_TYPES.LIST;
const isTablePanel = widget.panelTypes === PANEL_TYPES.TABLE;
const [searchTerm, setSearchTerm] = useState<string>('');
if (response.isLoading && widget.panelTypes !== PANEL_TYPES.LIST) {
return <Spinner height="100%" size="large" tip="Loading..." />;
}
@ -216,6 +224,18 @@ function FullView({
}}
isGraphLegendToggleAvailable={canModifyChart}
>
{isTablePanel && (
<Input
addonBefore={<SearchOutlined size={14} />}
className="global-search"
placeholder="Search..."
allowClear
key={widget.id}
onChange={(e): void => {
setSearchTerm(e.target.value || '');
}}
/>
)}
<PanelWrapper
queryResponse={response}
widget={widget}
@ -226,6 +246,7 @@ function FullView({
graphVisibility={graphsVisibilityStates}
onDragSelect={onDragSelect}
tableProcessedDataRef={tableProcessedDataRef}
searchTerm={searchTerm}
/>
</GraphContainer>
</div>

View File

@ -234,6 +234,8 @@ function WidgetGraphComponent({
});
};
const [searchTerm, setSearchTerm] = useState<string>('');
const loadingState =
(queryResponse.isLoading || queryResponse.status === 'idle') &&
widget.panelTypes !== PANEL_TYPES.LIST;
@ -317,6 +319,7 @@ function WidgetGraphComponent({
isWarning={isWarning}
isFetchingResponse={isFetchingResponse}
tableProcessedDataRef={tableProcessedDataRef}
setSearchTerm={setSearchTerm}
/>
</div>
{queryResponse.isLoading && widget.panelTypes !== PANEL_TYPES.LIST && (
@ -337,6 +340,7 @@ function WidgetGraphComponent({
onDragSelect={onDragSelect}
tableProcessedDataRef={tableProcessedDataRef}
customTooltipElement={customTooltipElement}
searchTerm={searchTerm}
/>
</div>
)}

View File

@ -11,6 +11,7 @@ import { isEqual } from 'lodash-es';
import isEmpty from 'lodash-es/isEmpty';
import { useDashboard } from 'providers/Dashboard/Dashboard';
import { memo, useEffect, useRef, useState } from 'react';
import { useQueryClient } from 'react-query';
import { useDispatch, useSelector } from 'react-redux';
import { UpdateTimeInterval } from 'store/actions';
import { AppState } from 'store/reducers';
@ -48,6 +49,7 @@ function GridCardGraph({
AppState,
GlobalReducer
>((state) => state.globalTime);
const queryClient = useQueryClient();
const handleBackNavigation = (): void => {
const searchParams = new URLSearchParams(window.location.search);
@ -136,6 +138,25 @@ function GridCardGraph({
};
});
// TODO [vikrantgupta25]: remove this useEffect during the refactor as it is prone to race conditions
// this is added to tackle the case of async communication between VariableItem.tsx and GridCard.tsx
useEffect(() => {
if (variablesToGetUpdated.length > 0) {
queryClient.cancelQueries([
maxTime,
minTime,
globalSelectedInterval,
variables,
widget?.query,
widget?.panelTypes,
widget.timePreferance,
widget.fillSpans,
requestData,
]);
}
// eslint-disable-next-line react-hooks/exhaustive-deps
}, [variablesToGetUpdated]);
useEffect(() => {
if (!isEqual(updatedQuery, requestData.query)) {
setRequestData((prev) => ({

View File

@ -2,7 +2,7 @@
display: flex;
justify-content: space-between;
align-items: center;
height: 30px;
height: 36px;
width: 100%;
padding: 0.5rem;
box-sizing: border-box;
@ -10,6 +10,14 @@
font-weight: 600;
cursor: move;
.ant-input-group-addon {
border: none;
background-color: var(--bg-ink-500);
}
.search-header-icons {
cursor: pointer;
}
}
.widget-header-title {
@ -19,6 +27,7 @@
.widget-header-actions {
display: flex;
align-items: center;
gap: 8px;
}
.widget-header-more-options {
visibility: hidden;
@ -30,6 +39,10 @@
padding: 8px;
}
.widget-header-more-options-visible {
visibility: visible;
}
.widget-header-hover {
visibility: visible;
}
@ -37,3 +50,11 @@
.widget-api-actions {
padding-right: 0.25rem;
}
.lightMode {
.widget-header-container {
.ant-input-group-addon {
background-color: inherit;
}
}
}

View File

@ -9,9 +9,10 @@ import {
ExclamationCircleOutlined,
FullscreenOutlined,
MoreOutlined,
SearchOutlined,
WarningOutlined,
} from '@ant-design/icons';
import { Dropdown, MenuProps, Tooltip, Typography } from 'antd';
import { Dropdown, Input, MenuProps, Tooltip, Typography } from 'antd';
import Spinner from 'components/Spinner';
import { QueryParams } from 'constants/query';
import { PANEL_TYPES } from 'constants/queryBuilder';
@ -20,8 +21,9 @@ import useComponentPermission from 'hooks/useComponentPermission';
import history from 'lib/history';
import { RowData } from 'lib/query/createTableColumnsFromQuery';
import { isEmpty } from 'lodash-es';
import { X } from 'lucide-react';
import { unparse } from 'papaparse';
import { ReactNode, useCallback, useMemo } from 'react';
import { ReactNode, useCallback, useMemo, useState } from 'react';
import { UseQueryResult } from 'react-query';
import { useSelector } from 'react-redux';
import { AppState } from 'store/reducers';
@ -51,6 +53,7 @@ interface IWidgetHeaderProps {
isWarning: boolean;
isFetchingResponse: boolean;
tableProcessedDataRef: React.MutableRefObject<RowData[]>;
setSearchTerm: React.Dispatch<React.SetStateAction<string>>;
}
function WidgetHeader({
@ -67,6 +70,7 @@ function WidgetHeader({
isWarning,
isFetchingResponse,
tableProcessedDataRef,
setSearchTerm,
}: IWidgetHeaderProps): JSX.Element | null {
const onEditHandler = useCallback((): void => {
const widgetId = widget.id;
@ -187,6 +191,10 @@ function WidgetHeader({
const updatedMenuList = useMemo(() => generateMenuList(actions), [actions]);
const [showGlobalSearch, setShowGlobalSearch] = useState(false);
const globalSearchAvailable = widget.panelTypes === PANEL_TYPES.TABLE;
const menu = useMemo(
() => ({
items: updatedMenuList,
@ -201,46 +209,80 @@ function WidgetHeader({
return (
<div className="widget-header-container">
<Typography.Text
ellipsis
data-testid={title}
className="widget-header-title"
>
{title}
</Typography.Text>
<div className="widget-header-actions">
<div className="widget-api-actions">{threshold}</div>
{isFetchingResponse && !queryResponse.isError && (
<Spinner style={{ paddingRight: '0.25rem' }} />
)}
{queryResponse.isError && (
<Tooltip
title={errorMessage}
placement={errorTooltipPosition}
className="widget-api-actions"
{showGlobalSearch ? (
<Input
addonBefore={<SearchOutlined size={14} />}
placeholder="Search..."
bordered={false}
data-testid="widget-header-search-input"
autoFocus
addonAfter={
<X
size={14}
onClick={(e): void => {
e.stopPropagation();
e.preventDefault();
setShowGlobalSearch(false);
}}
className="search-header-icons"
/>
}
key={widget.id}
onChange={(e): void => {
setSearchTerm(e.target.value || '');
}}
/>
) : (
<>
<Typography.Text
ellipsis
data-testid={title}
className="widget-header-title"
>
<ExclamationCircleOutlined />
</Tooltip>
)}
{title}
</Typography.Text>
<div className="widget-header-actions">
<div className="widget-api-actions">{threshold}</div>
{isFetchingResponse && !queryResponse.isError && (
<Spinner style={{ paddingRight: '0.25rem' }} />
)}
{queryResponse.isError && (
<Tooltip
title={errorMessage}
placement={errorTooltipPosition}
className="widget-api-actions"
>
<ExclamationCircleOutlined />
</Tooltip>
)}
{isWarning && (
<Tooltip
title={WARNING_MESSAGE}
placement={errorTooltipPosition}
className="widget-api-actions"
>
<WarningOutlined />
</Tooltip>
)}
<Dropdown menu={menu} trigger={['hover']} placement="bottomRight">
<MoreOutlined
data-testid="widget-header-options"
className={`widget-header-more-options ${
parentHover ? 'widget-header-hover' : ''
}`}
/>
</Dropdown>
</div>
{isWarning && (
<Tooltip
title={WARNING_MESSAGE}
placement={errorTooltipPosition}
className="widget-api-actions"
>
<WarningOutlined />
</Tooltip>
)}
{globalSearchAvailable && (
<SearchOutlined
className="search-header-icons"
onClick={(): void => setShowGlobalSearch(true)}
data-testid="widget-header-search"
/>
)}
<Dropdown menu={menu} trigger={['hover']} placement="bottomRight">
<MoreOutlined
data-testid="widget-header-options"
className={`widget-header-more-options ${
parentHover ? 'widget-header-hover' : ''
} ${globalSearchAvailable ? 'widget-header-more-options-visible' : ''}`}
/>
</Dropdown>
</div>
</>
)}
</div>
);
}

View File

@ -33,7 +33,14 @@ export const Card = styled(CardComponent)<CardProps>`
}
.ant-card-body {
height: calc(100% - 30px);
${({ $panelType }): StyledCSS =>
$panelType === PANEL_TYPES.TABLE
? css`
height: 100%;
`
: css`
height: calc(100% - 30px);
`}
padding: 0;
}
`;

View File

@ -4,7 +4,7 @@ import { getYAxisFormattedValue } from 'components/Graph/yAxisConfig';
import { Events } from 'constants/events';
import { QueryTable } from 'container/QueryTable';
import { RowData } from 'lib/query/createTableColumnsFromQuery';
import { cloneDeep, get, isEmpty, set } from 'lodash-es';
import { cloneDeep, get, isEmpty } from 'lodash-es';
import { memo, ReactNode, useCallback, useEffect, useMemo } from 'react';
import { useTranslation } from 'react-i18next';
import { eventEmitter } from 'utils/getEventEmitter';
@ -38,15 +38,13 @@ function GridTableComponent({
const createDataInCorrectFormat = useCallback(
(dataSource: RowData[]): RowData[] =>
dataSource.map((d) => {
const finalObject = {};
const finalObject: Record<string, number | string> = {};
// we use the order of the columns here so the downloaded data matches the user's view
// bracket access is used because the titles can contain dots
columns.forEach((k) => {
set(
finalObject,
get(k, 'title', '') as string,
get(d, get(k, 'dataIndex', ''), 'n/a'),
);
finalObject[`${get(k, 'title', '')}`] =
d[`${get(k, 'dataIndex', '')}`] || 'n/a';
});
return finalObject as RowData;
}),
@ -86,6 +84,7 @@ function GridTableComponent({
applyColumnUnits,
originalDataSource,
]);
useEffect(() => {
if (tableProcessedDataRef) {
// eslint-disable-next-line no-param-reassign

View File

@ -14,6 +14,7 @@ export type GridTableComponentProps = {
columnUnits?: ColumnUnit;
tableProcessedDataRef?: React.MutableRefObject<RowData[]>;
sticky?: TableProps<RowData>['sticky'];
searchTerm?: string;
} & Pick<LogsExplorerTableProps, 'data'> &
Omit<TableProps<RowData>, 'columns' | 'dataSource'>;

View File

@ -1,6 +1,6 @@
/* eslint-disable react/display-name */
import { PlusOutlined } from '@ant-design/icons';
import { Input, Typography } from 'antd';
import { Flex, Input, Typography } from 'antd';
import type { ColumnsType } from 'antd/es/table/interface';
import saveAlertApi from 'api/alerts/save';
import logEvent from 'api/common/logEvent';
@ -34,12 +34,7 @@ import { GettableAlert } from 'types/api/alerts/get';
import AppReducer from 'types/reducer/app';
import DeleteAlert from './DeleteAlert';
import {
Button,
ButtonContainer,
ColumnButton,
SearchContainer,
} from './styles';
import { Button, ColumnButton, SearchContainer } from './styles';
import Status from './TableComponents/Status';
import ToggleAlertState from './ToggleAlertState';
import { alertActionLogEvent, filterAlerts } from './utils';
@ -373,21 +368,25 @@ function ListAlert({ allAlertRules, refetch }: ListAlertProps): JSX.Element {
onChange={handleSearch}
defaultValue={searchString}
/>
<ButtonContainer>
<Flex gap={12}>
{addNewAlert && (
<Button
type="primary"
onClick={onClickNewAlertHandler}
icon={<PlusOutlined />}
>
New Alert
</Button>
)}
<TextToolTip
{...{
text: `More details on how to create alerts`,
url:
'https://signoz.io/docs/alerts/?utm_source=product&utm_medium=list-alerts',
urlText: 'Learn More',
}}
/>
{addNewAlert && (
<Button onClick={onClickNewAlertHandler} icon={<PlusOutlined />}>
New Alert
</Button>
)}
</ButtonContainer>
</Flex>
</SearchContainer>
<DynamicColumnTable
tablesource={TableDataSource.Alert}

View File

@ -9,12 +9,6 @@ export const SearchContainer = styled.div`
gap: 2rem;
}
`;
export const ButtonContainer = styled.div`
&&& {
display: flex;
align-items: center;
}
`;
export const Button = styled(ButtonComponent)`
&&& {

View File

@ -64,9 +64,9 @@
.dashboard-icon {
display: inline-block;
margin-top: 4px;
margin-right: 4px;
line-height: 20px;
height: 14px;
width: 14px;
}
.dot {
@ -75,6 +75,12 @@
border-radius: 50%;
}
.title-link {
display: flex;
align-items: center;
gap: 8px;
}
.title {
color: var(--bg-vanilla-100);
font-size: var(--font-size-sm);

View File

@ -91,6 +91,7 @@ function DashboardsList(): JSX.Element {
const {
data: dashboardListResponse,
isLoading: isDashboardListLoading,
isRefetching: isDashboardListRefetching,
error: dashboardFetchError,
refetch: refetchDashboardList,
} = useGetAllDashboard();
@ -458,17 +459,19 @@ function DashboardsList(): JSX.Element {
placement="left"
overlayClassName="title-toolip"
>
<Typography.Text data-testid={`dashboard-title-${index}`}>
<Link to={getLink()} className="title">
<img
src={dashboard?.image || Base64Icons[0]}
style={{ height: '14px', width: '14px' }}
alt="dashboard-image"
className="dashboard-icon"
/>
<Link to={getLink()} className="title-link">
<img
src={dashboard?.image || Base64Icons[0]}
alt="dashboard-image"
className="dashboard-icon"
/>
<Typography.Text
data-testid={`dashboard-title-${index}`}
className="title"
>
{dashboard.name}
</Link>
</Typography.Text>
</Typography.Text>
</Link>
</Tooltip>
</div>
@ -703,7 +706,9 @@ function DashboardsList(): JSX.Element {
</Flex>
</div>
{isDashboardListLoading || isFilteringDashboards ? (
{isDashboardListLoading ||
isFilteringDashboards ||
isDashboardListRefetching ? (
<div className="loading-dashboard-details">
<Skeleton.Input active size="large" className="skeleton-1" />
<Skeleton.Input active size="large" className="skeleton-1" />
@ -902,7 +907,11 @@ function DashboardsList(): JSX.Element {
columns={columns}
dataSource={data}
showSorterTooltip
loading={isDashboardListLoading || isFilteringDashboards}
loading={
isDashboardListLoading ||
isFilteringDashboards ||
isDashboardListRefetching
}
showHeader={false}
pagination={paginationConfig}
/>

View File

@ -0,0 +1,34 @@
.empty-container {
display: flex;
justify-content: center;
align-items: center;
height: 100%;
}
.infra-metrics-container {
.views-tabs {
margin-bottom: 1rem;
}
}
.infra-metrics-card {
margin: 1rem 0;
height: 300px;
padding: 10px;
.ant-card-body {
padding: 0;
}
.chart-container {
width: 100%;
height: 100%;
}
.no-data-container {
position: absolute;
top: 50%;
left: 50%;
transform: translate(-50%, -50%);
}
}

View File

@ -0,0 +1,94 @@
import './InfraMetrics.styles.scss';
import { Empty, Radio } from 'antd';
import { RadioChangeEvent } from 'antd/lib';
import { History, Table } from 'lucide-react';
import { useState } from 'react';
import { VIEW_TYPES } from './constants';
import NodeMetrics from './NodeMetrics';
import PodMetrics from './PodMetrics';
interface MetricsDataProps {
podName: string;
nodeName: string;
hostName: string;
clusterName: string;
logLineTimestamp: string;
}
function InfraMetrics({
podName,
nodeName,
hostName,
clusterName,
logLineTimestamp,
}: MetricsDataProps): JSX.Element {
const [selectedView, setSelectedView] = useState<string>(() =>
podName ? VIEW_TYPES.POD : VIEW_TYPES.NODE,
);
const handleModeChange = (e: RadioChangeEvent): void => {
setSelectedView(e.target.value);
};
if (!podName && !nodeName && !hostName) {
return (
<div className="empty-container">
<Empty
image={Empty.PRESENTED_IMAGE_SIMPLE}
description="No data available. Please select a valid log line containing a pod, node, or host attributes to view metrics."
/>
</div>
);
}
return (
<div className="infra-metrics-container">
<Radio.Group
className="views-tabs"
onChange={handleModeChange}
value={selectedView}
>
<Radio.Button
className={selectedView === VIEW_TYPES.NODE ? 'selected_view tab' : 'tab'}
value={VIEW_TYPES.NODE}
>
<div className="view-title">
<Table size={14} />
Node
</div>
</Radio.Button>
{podName && (
<Radio.Button
className={selectedView === VIEW_TYPES.POD ? 'selected_view tab' : 'tab'}
value={VIEW_TYPES.POD}
>
<div className="view-title">
<History size={14} />
Pod
</div>
</Radio.Button>
)}
</Radio.Group>
{/* TODO(Rahul): Make a common config driven component for this and other infra metrics components */}
{selectedView === VIEW_TYPES.NODE && (
<NodeMetrics
nodeName={nodeName}
clusterName={clusterName}
hostName={hostName}
logLineTimestamp={logLineTimestamp}
/>
)}
{selectedView === VIEW_TYPES.POD && podName && (
<PodMetrics
podName={podName}
clusterName={clusterName}
logLineTimestamp={logLineTimestamp}
/>
)}
</div>
);
}
export default InfraMetrics;

View File

@ -0,0 +1,140 @@
import { Card, Col, Row, Skeleton, Typography } from 'antd';
import cx from 'classnames';
import Uplot from 'components/Uplot';
import { ENTITY_VERSION_V4 } from 'constants/app';
import dayjs from 'dayjs';
import { useIsDarkMode } from 'hooks/useDarkMode';
import { useResizeObserver } from 'hooks/useDimensions';
import { GetMetricQueryRange } from 'lib/dashboard/getQueryResults';
import { getUPlotChartOptions } from 'lib/uPlotLib/getUplotChartOptions';
import { getUPlotChartData } from 'lib/uPlotLib/utils/getUplotChartData';
import { useMemo, useRef } from 'react';
import { useQueries, UseQueryResult } from 'react-query';
import { SuccessResponse } from 'types/api';
import { MetricRangePayloadProps } from 'types/api/metrics/getQueryRange';
import {
getHostQueryPayload,
getNodeQueryPayload,
hostWidgetInfo,
nodeWidgetInfo,
} from './constants';
function NodeMetrics({
nodeName,
clusterName,
hostName,
logLineTimestamp,
}: {
nodeName: string;
clusterName: string;
hostName: string;
logLineTimestamp: string;
}): JSX.Element {
const { start, end, verticalLineTimestamp } = useMemo(() => {
const logTimestamp = dayjs(logLineTimestamp);
const now = dayjs();
const startTime = logTimestamp.subtract(3, 'hour');
const endTime = logTimestamp.add(3, 'hour').isBefore(now)
? logTimestamp.add(3, 'hour')
: now;
return {
start: startTime.unix(),
end: endTime.unix(),
verticalLineTimestamp: logTimestamp.unix(),
};
}, [logLineTimestamp]);
const queryPayloads = useMemo(() => {
if (nodeName) {
return getNodeQueryPayload(clusterName, nodeName, start, end);
}
return getHostQueryPayload(hostName, start, end);
}, [nodeName, hostName, clusterName, start, end]);
const widgetInfo = nodeName ? nodeWidgetInfo : hostWidgetInfo;
const queries = useQueries(
queryPayloads.map((payload) => ({
queryKey: ['metrics', payload, ENTITY_VERSION_V4, 'NODE'],
queryFn: (): Promise<SuccessResponse<MetricRangePayloadProps>> =>
GetMetricQueryRange(payload, ENTITY_VERSION_V4),
enabled: !!payload,
})),
);
const isDarkMode = useIsDarkMode();
const graphRef = useRef<HTMLDivElement>(null);
const dimensions = useResizeObserver(graphRef);
const chartData = useMemo(
() => queries.map(({ data }) => getUPlotChartData(data?.payload)),
[queries],
);
const options = useMemo(
() =>
queries.map(({ data }, idx) =>
getUPlotChartOptions({
apiResponse: data?.payload,
isDarkMode,
dimensions,
yAxisUnit: widgetInfo[idx].yAxisUnit,
softMax: null,
softMin: null,
minTimeScale: start,
maxTimeScale: end,
verticalLineTimestamp,
}),
),
[
queries,
isDarkMode,
dimensions,
widgetInfo,
start,
verticalLineTimestamp,
end,
],
);
const renderCardContent = (
query: UseQueryResult<SuccessResponse<MetricRangePayloadProps>, unknown>,
idx: number,
): JSX.Element => {
if (query.isLoading) {
return <Skeleton />;
}
if (query.error) {
const errorMessage =
(query.error as Error)?.message || 'Something went wrong';
return <div>{errorMessage}</div>;
}
return (
<div
className={cx('chart-container', {
'no-data-container':
!query.isLoading && !query?.data?.payload?.data?.result?.length,
})}
>
<Uplot options={options[idx]} data={chartData[idx]} />
</div>
);
};
return (
<Row gutter={24}>
{queries.map((query, idx) => (
<Col span={12} key={widgetInfo[idx].title}>
<Typography.Text>{widgetInfo[idx].title}</Typography.Text>
<Card bordered className="infra-metrics-card" ref={graphRef}>
{renderCardContent(query, idx)}
</Card>
</Col>
))}
</Row>
);
}
export default NodeMetrics;

View File

@ -0,0 +1,121 @@
import { Card, Col, Row, Skeleton, Typography } from 'antd';
import cx from 'classnames';
import Uplot from 'components/Uplot';
import { ENTITY_VERSION_V4 } from 'constants/app';
import dayjs from 'dayjs';
import { useIsDarkMode } from 'hooks/useDarkMode';
import { useResizeObserver } from 'hooks/useDimensions';
import { GetMetricQueryRange } from 'lib/dashboard/getQueryResults';
import { getUPlotChartOptions } from 'lib/uPlotLib/getUplotChartOptions';
import { getUPlotChartData } from 'lib/uPlotLib/utils/getUplotChartData';
import { useMemo, useRef } from 'react';
import { useQueries, UseQueryResult } from 'react-query';
import { SuccessResponse } from 'types/api';
import { MetricRangePayloadProps } from 'types/api/metrics/getQueryRange';
import { getPodQueryPayload, podWidgetInfo } from './constants';
function PodMetrics({
podName,
clusterName,
logLineTimestamp,
}: {
podName: string;
clusterName: string;
logLineTimestamp: string;
}): JSX.Element {
const { start, end, verticalLineTimestamp } = useMemo(() => {
const logTimestamp = dayjs(logLineTimestamp);
const now = dayjs();
const startTime = logTimestamp.subtract(3, 'hour');
const endTime = logTimestamp.add(3, 'hour').isBefore(now)
? logTimestamp.add(3, 'hour')
: now;
return {
start: startTime.unix(),
end: endTime.unix(),
verticalLineTimestamp: logTimestamp.unix(),
};
}, [logLineTimestamp]);
const queryPayloads = useMemo(
() => getPodQueryPayload(clusterName, podName, start, end),
[clusterName, end, podName, start],
);
const queries = useQueries(
queryPayloads.map((payload) => ({
queryKey: ['metrics', payload, ENTITY_VERSION_V4, 'POD'],
queryFn: (): Promise<SuccessResponse<MetricRangePayloadProps>> =>
GetMetricQueryRange(payload, ENTITY_VERSION_V4),
enabled: !!payload,
})),
);
const isDarkMode = useIsDarkMode();
const graphRef = useRef<HTMLDivElement>(null);
const dimensions = useResizeObserver(graphRef);
const chartData = useMemo(
() => queries.map(({ data }) => getUPlotChartData(data?.payload)),
[queries],
);
const options = useMemo(
() =>
queries.map(({ data }, idx) =>
getUPlotChartOptions({
apiResponse: data?.payload,
isDarkMode,
dimensions,
yAxisUnit: podWidgetInfo[idx].yAxisUnit,
softMax: null,
softMin: null,
minTimeScale: start,
maxTimeScale: end,
verticalLineTimestamp,
}),
),
[queries, isDarkMode, dimensions, start, verticalLineTimestamp, end],
);
const renderCardContent = (
query: UseQueryResult<SuccessResponse<MetricRangePayloadProps>, unknown>,
idx: number,
): JSX.Element => {
if (query.isLoading) {
return <Skeleton />;
}
if (query.error) {
const errorMessage =
(query.error as Error)?.message || 'Something went wrong';
return <div>{errorMessage}</div>;
}
return (
<div
className={cx('chart-container', {
'no-data-container':
!query.isLoading && !query?.data?.payload?.data?.result?.length,
})}
>
<Uplot options={options[idx]} data={chartData[idx]} />
</div>
);
};
return (
<Row gutter={24}>
{queries.map((query, idx) => (
<Col span={12} key={podWidgetInfo[idx].title}>
<Typography.Text>{podWidgetInfo[idx].title}</Typography.Text>
<Card bordered className="infra-metrics-card" ref={graphRef}>
{renderCardContent(query, idx)}
</Card>
</Col>
))}
</Row>
);
}
export default PodMetrics;

File diff suppressed because it is too large

View File

@ -122,10 +122,10 @@ function TableView({
fieldValue: string,
) => (): void => {
handleClick(operator, fieldKey, fieldValue);
if (operator === OPERATORS.IN) {
if (operator === OPERATORS['=']) {
setIsFilterInLoading(true);
}
if (operator === OPERATORS.NIN) {
if (operator === OPERATORS['!=']) {
setIsFilterOutLoading(true);
}
};

View File

@ -139,7 +139,7 @@ export function TableViewActions(
<ArrowDownToDot size={14} style={{ transform: 'rotate(90deg)' }} />
)
}
onClick={onClickHandler(OPERATORS.IN, fieldFilterKey, fieldData.value)}
onClick={onClickHandler(OPERATORS['='], fieldFilterKey, fieldData.value)}
/>
</Tooltip>
<Tooltip title="Filter out value">
@ -152,7 +152,11 @@ export function TableViewActions(
<ArrowUpFromDot size={14} style={{ transform: 'rotate(90deg)' }} />
)
}
onClick={onClickHandler(OPERATORS.NIN, fieldFilterKey, fieldData.value)}
onClick={onClickHandler(
OPERATORS['!='],
fieldFilterKey,
fieldData.value,
)}
/>
</Tooltip>
{!isOldLogsExplorerOrLiveLogsPage && (

View File

@ -1,6 +1,7 @@
import LogDetail from 'components/LogDetail';
import { VIEW_TYPES } from 'components/LogDetail/constants';
import ROUTES from 'constants/routes';
import { getOldLogsOperatorFromNew } from 'hooks/logs/useActiveLog';
import { getGeneratedFilterQueryString } from 'lib/getGeneratedFilterQueryString';
import getStep from 'lib/getStep';
import { getIdConditions } from 'pages/Logs/utils';
@ -57,10 +58,11 @@ function LogDetailedView({
const handleAddToQuery = useCallback(
(fieldKey: string, fieldValue: string, operator: string) => {
const newOperator = getOldLogsOperatorFromNew(operator);
const updatedQueryString = getGeneratedFilterQueryString(
fieldKey,
fieldValue,
operator,
newOperator,
queryString,
);
@ -71,10 +73,11 @@ function LogDetailedView({
const handleClickActionItem = useCallback(
(fieldKey: string, fieldValue: string, operator: string): void => {
const newOperator = getOldLogsOperatorFromNew(operator);
const updatedQueryString = getGeneratedFilterQueryString(
fieldKey,
fieldValue,
operator,
newOperator,
queryString,
);
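
Both handlers above now translate the new-style operator before building the legacy filter query string. A minimal sketch of what `getOldLogsOperatorFromNew` plausibly does, assuming the `=` to `IN` and `!=` to `NIN` pairing implied by the `TableView` change earlier in this diff (the real implementation lives in `hooks/logs/useActiveLog`):

// Sketch only: map new query-builder operators to legacy logs operators.
const NEW_TO_OLD_OPERATORS: Record<string, string> = {
  '=': 'IN',
  '!=': 'NIN',
};

export function getOldLogsOperatorFromNew(operator: string): string {
  // Fall back to the incoming operator when there is no legacy equivalent.
  return NEW_TO_OLD_OPERATORS[operator] ?? operator;
}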

@@ -1,6 +1,5 @@
import './LogsExplorerQuerySection.styles.scss';
import { FeatureKeys } from 'constants/features';
import {
initialQueriesMap,
OPERATORS,
@@ -9,14 +8,12 @@ import ExplorerOrderBy from 'container/ExplorerOrderBy';
import ExplorerOrderBy from 'container/ExplorerOrderBy';
import { QueryBuilder } from 'container/QueryBuilder';
import { OrderByFilterProps } from 'container/QueryBuilder/filters/OrderByFilter/OrderByFilter.interfaces';
import QueryBuilderSearch from 'container/QueryBuilder/filters/QueryBuilderSearch';
import QueryBuilderSearchV2 from 'container/QueryBuilder/filters/QueryBuilderSearchV2/QueryBuilderSearchV2';
import { QueryBuilderProps } from 'container/QueryBuilder/QueryBuilder.interfaces';
import { useGetPanelTypesQueryParam } from 'hooks/queryBuilder/useGetPanelTypesQueryParam';
import { useQueryBuilder } from 'hooks/queryBuilder/useQueryBuilder';
import { useQueryOperations } from 'hooks/queryBuilder/useQueryBuilderOperations';
import { useShareBuilderUrl } from 'hooks/queryBuilder/useShareBuilderUrl';
import useFeatureFlags from 'hooks/useFeatureFlag';
import {
prepareQueryWithDefaultTimestamp,
SELECTED_VIEWS,
@@ -89,26 +86,15 @@ function LogExplorerQuerySection({
[handleChangeQueryData],
);
const isSearchV2Enabled =
useFeatureFlags(FeatureKeys.QUERY_BUILDER_SEARCH_V2)?.active || false;
return (
<>
{selectedView === SELECTED_VIEWS.SEARCH && (
<div className="qb-search-view-container">
{isSearchV2Enabled ? (
<QueryBuilderSearchV2
query={query}
onChange={handleChangeTagFilters}
whereClauseConfig={filterConfigs?.filters}
/>
) : (
<QueryBuilderSearch
query={query}
onChange={handleChangeTagFilters}
whereClauseConfig={filterConfigs?.filters}
/>
)}
<QueryBuilderSearchV2
query={query}
onChange={handleChangeTagFilters}
whereClauseConfig={filterConfigs?.filters}
/>
</div>
)}

@@ -3,6 +3,7 @@ import { QueryData } from 'types/api/widgets/getQuery';
export type LogsExplorerChartProps = {
data: QueryData[];
isLoading: boolean;
isLogsExplorerViews?: boolean;
isLabelEnabled?: boolean;
className?: string;
};

@@ -16,12 +16,14 @@ import { UpdateTimeInterval } from 'store/actions';
import { LogsExplorerChartProps } from './LogsExplorerChart.interfaces';
import { CardStyled } from './LogsExplorerChart.styled';
import { getColorsForSeverityLabels } from './utils';
function LogsExplorerChart({
data,
isLoading,
isLabelEnabled = true,
className,
isLogsExplorerViews = false,
}: LogsExplorerChartProps): JSX.Element {
const dispatch = useDispatch();
const urlQuery = useUrlQuery();
@@ -29,15 +31,19 @@ function LogsExplorerChart({
const handleCreateDatasets: Required<GetChartDataProps>['createDataset'] = useCallback(
(element, index, allLabels) => ({
data: element,
backgroundColor: colors[index % colors.length] || themeColors.red,
borderColor: colors[index % colors.length] || themeColors.red,
backgroundColor: isLogsExplorerViews
? getColorsForSeverityLabels(allLabels[index], index)
: colors[index % colors.length] || themeColors.red,
borderColor: isLogsExplorerViews
? getColorsForSeverityLabels(allLabels[index], index)
: colors[index % colors.length] || themeColors.red,
...(isLabelEnabled
? {
label: allLabels[index],
}
: {}),
}),
[isLabelEnabled],
[isLabelEnabled, isLogsExplorerViews],
);
const onDragSelect = useCallback(
@@ -112,6 +118,7 @@ function LogsExplorerChart({
<Graph
name="logsExplorerChart"
data={graphData.data}
isStacked={isLogsExplorerViews}
type="bar"
animate
onDragSelect={onDragSelect}

@@ -0,0 +1,39 @@
import { Color } from '@signozhq/design-tokens';
import { themeColors } from 'constants/theme';
import { colors } from 'lib/getRandomColor';
export function getColorsForSeverityLabels(
label: string,
index: number,
): string {
const lowerCaseLabel = label.toLowerCase();
if (lowerCaseLabel.includes(`{severity_text="trace"}`)) {
return Color.BG_FOREST_400;
}
if (lowerCaseLabel.includes(`{severity_text="debug"}`)) {
return Color.BG_AQUA_500;
}
if (
lowerCaseLabel.includes(`{severity_text="info"}`) ||
lowerCaseLabel.includes(`{severity_text=""}`)
) {
return Color.BG_ROBIN_500;
}
if (lowerCaseLabel.includes(`{severity_text="warn"}`)) {
return Color.BG_AMBER_500;
}
if (lowerCaseLabel.includes(`{severity_text="error"}`)) {
return Color.BG_CHERRY_500;
}
if (lowerCaseLabel.includes(`{severity_text="fatal"}`)) {
return Color.BG_SAKURA_500;
}
return colors[index % colors.length] || themeColors.red;
}
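
Note that the matcher works on the serialized series label, not a bare severity value, so callers are expected to pass labels of the form `{severity_text="..."}`. Expected behavior, following the branches above:

// Expected behavior of getColorsForSeverityLabels:
getColorsForSeverityLabels('{severity_text="error"}', 0); // Color.BG_CHERRY_500
getColorsForSeverityLabels('{severity_text="info"}', 1); // Color.BG_ROBIN_500
getColorsForSeverityLabels('{severity_text=""}', 2); // Color.BG_ROBIN_500 (treated as info)
getColorsForSeverityLabels('{service="api"}', 3); // falls back to colors[3 % colors.length]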

@@ -147,6 +147,13 @@
}
.logs-histogram {
.ant-card-body {
height: 140px;
min-height: 140px;
padding: 0 16px 22px 16px;
font-family: 'Geist Mono';
}
margin-bottom: 0px;
}
}

@@ -64,6 +64,7 @@ import { useHistory } from 'react-router-dom';
import { AppState } from 'store/reducers';
import { Dashboard } from 'types/api/dashboard/getAll';
import { ILog } from 'types/api/logs/log';
import { DataTypes } from 'types/api/queryBuilder/queryAutocompleteResponse';
import {
IBuilderQuery,
OrderByPayload,
@@ -132,6 +133,9 @@ function LogsExplorerViews({
// State
const [page, setPage] = useState<number>(1);
const [logs, setLogs] = useState<ILog[]>([]);
const [lastLogLineTimestamp, setLastLogLineTimestamp] = useState<
number | string | null
>();
const [requestData, setRequestData] = useState<Query | null>(null);
const [showFormatMenuItems, setShowFormatMenuItems] = useState(false);
const [queryId, setQueryId] = useState<string>(v4());
@@ -188,6 +192,16 @@ function LogsExplorerViews({
const modifiedQueryData: IBuilderQuery = {
...listQuery,
aggregateOperator: LogsAggregatorOperator.COUNT,
groupBy: [
{
key: 'severity_text',
dataType: DataTypes.String,
type: '',
isColumn: true,
isJSON: false,
id: 'severity_text--string----true',
},
],
};
const modifiedQuery: Query = {
@@ -259,6 +273,14 @@
start: minTime,
end: maxTime,
}),
// send the lastLogLineTimestamp only when the panel type is list, the orderBy column is timestamp, and the order is desc
lastLogLineTimestamp:
panelType === PANEL_TYPES.LIST &&
requestData?.builder?.queryData?.[0]?.orderBy?.[0]?.columnName ===
'timestamp' &&
requestData?.builder?.queryData?.[0]?.orderBy?.[0]?.order === 'desc'
? lastLogLineTimestamp
: undefined,
},
undefined,
listQueryKeyRef,
@@ -336,6 +358,10 @@
pageSize: nextPageSize,
});
// lastLogLineTimestamp starts out as null because no logs have been rendered yet;
// once we scroll to the end of the list, set it to the last log's timestamp.
setLastLogLineTimestamp(lastLog.timestamp);
setPage((prevPage) => prevPage + 1);
setRequestData(newRequestData);
@@ -528,6 +554,11 @@ function LogsExplorerViews({
// eslint-disable-next-line react-hooks/exhaustive-deps
}, [data]);
useEffect(() => {
// clear the lastLogLineTimestamp when the data changes
setLastLogLineTimestamp(null);
}, [data]);
useEffect(() => {
if (
requestData?.id !== stagedQuery?.id ||
@@ -661,6 +692,7 @@ function LogsExplorerViews({
className="logs-histogram"
isLoading={isFetchingListChartData || isLoadingListChartData}
data={chartData}
isLogsExplorerViews={panelType === PANEL_TYPES.LIST}
/>
)}
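
The net effect of the three hunks above is a keyset-style cursor: `lastLogLineTimestamp` is set when the user reaches the end of the list, cleared whenever fresh data arrives, and forwarded to the API only when the list is sorted by `timestamp` descending. A condensed sketch of that guard, reusing the names from this diff (`baseParams` is a hypothetical stand-in for the other request fields):

// Sketch: forward the pagination cursor only when it is meaningful.
const orderBy = requestData?.builder?.queryData?.[0]?.orderBy?.[0];
const shouldSendCursor =
  panelType === PANEL_TYPES.LIST &&
  orderBy?.columnName === 'timestamp' &&
  orderBy?.order === 'desc';

const payload = {
  ...baseParams, // hypothetical: start/end and the staged query built above
  lastLogLineTimestamp: shouldSendCursor ? lastLogLineTimestamp : undefined,
};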

@@ -28,6 +28,20 @@ const lodsQueryServerRequest = (): void =>
),
);
jest.mock('uplot', () => {
const paths = {
spline: jest.fn(),
bars: jest.fn(),
};
const uplotMock = jest.fn(() => ({
paths,
}));
return {
paths,
default: uplotMock,
};
});
// mocking the graph components in this test as this should be handled separately
jest.mock(
'container/TimeSeriesView/TimeSeriesView',
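
The inline `jest.mock('uplot', ...)` block above reappears verbatim in the pipeline and tag-input test suites later in this diff, because uplot touches canvas APIs that jsdom does not implement. As a side note (not part of this PR), Jest picks up a manual mock for a node_modules package automatically, which would remove the duplication; a sketch under that assumption, at a hypothetical frontend/__mocks__/uplot.ts:

// Hypothetical frontend/__mocks__/uplot.ts: Jest resolves manual mocks for
// node_modules packages automatically, with no jest.mock() call per test file.
export const paths = {
  spline: jest.fn(),
  bars: jest.fn(),
};

const uplotMock = jest.fn(() => ({ paths }));
export default uplotMock;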

@@ -63,6 +63,8 @@
height: 40px;
justify-content: end;
padding: 0 8px;
margin-top: 12px;
margin-bottom: 2px;
}
}

@@ -130,12 +130,16 @@
.left-section {
display: flex;
flex-wrap: wrap;
align-items: center;
gap: 8px;
width: 45%;
.dashboard-img {
height: 16px;
width: 16px;
}
.dashboard-title {
color: #fff;
font-family: Inter;

@@ -306,16 +306,13 @@ function DashboardDescription(props: DashboardDescriptionProps): JSX.Element {
</div>
<section className="dashboard-details">
<div className="left-section">
<img src={image} alt="dashboard-img" className="dashboard-img" />
<Tooltip title={title.length > 30 ? title : ''}>
<Typography.Text
className="dashboard-title"
data-testid="dashboard-title"
>
<img
src={image}
alt="dashboard-img"
style={{ width: '16px', height: '16px' }}
/>{' '}
{' '}
{title}
</Typography.Text>
</Tooltip>

@@ -43,6 +43,15 @@
.ant-select-item {
display: flex;
align-items: center;
gap: 8px;
}
.rc-virtual-list-holder {
[data-testid='option-ALL'] {
border-bottom: 1px solid var(--bg-slate-400);
padding-bottom: 12px;
margin-bottom: 8px;
}
}
.all-label {
@@ -56,28 +65,25 @@
}
.dropdown-value {
display: flex;
justify-content: space-between;
align-items: center;
display: grid;
grid-template-columns: 1fr max-content;
.option-text {
max-width: 180px;
padding: 0 8px;
}
.toggle-tag-label {
padding-left: 8px;
right: 40px;
font-weight: normal;
position: absolute;
font-weight: 500;
}
}
}
}
.dropdown-styles {
min-width: 300px;
max-width: 350px;
min-width: 400px;
max-width: 500px;
}
.lightMode {

@@ -1,14 +1,8 @@
import '@testing-library/jest-dom/extend-expect';
import {
act,
fireEvent,
render,
screen,
waitFor,
} from '@testing-library/react';
import MockQueryClientProvider from 'providers/test/MockQueryClientProvider';
import React, { useEffect } from 'react';
import { act, fireEvent, render, screen, waitFor } from 'tests/test-utils';
import { IDashboardVariable } from 'types/api/dashboard/getAll';
import VariableItem from './VariableItem';

@@ -1,3 +1,4 @@
/* eslint-disable sonarjs/cognitive-complexity */
/* eslint-disable jsx-a11y/click-events-have-key-events */
/* eslint-disable jsx-a11y/no-static-element-interactions */
/* eslint-disable @typescript-eslint/no-explicit-any */
@@ -25,8 +26,11 @@ import { debounce, isArray, isString } from 'lodash-es';
import map from 'lodash-es/map';
import { ChangeEvent, memo, useEffect, useMemo, useState } from 'react';
import { useQuery } from 'react-query';
import { useSelector } from 'react-redux';
import { AppState } from 'store/reducers';
import { IDashboardVariable } from 'types/api/dashboard/getAll';
import { VariableResponseProps } from 'types/api/dashboard/variables/query';
import { GlobalReducer } from 'types/reducer/globalTime';
import { popupContainer } from 'utils/selectPopupContainer';
import { variablePropsToPayloadVariables } from '../utils';
@@ -58,14 +62,14 @@ interface VariableItemProps {
const getSelectValue = (
selectedValue: IDashboardVariable['selectedValue'],
variableData: IDashboardVariable,
): string | string[] => {
): string | string[] | undefined => {
if (Array.isArray(selectedValue)) {
if (!variableData.multiSelect && selectedValue.length === 1) {
return selectedValue[0]?.toString() || '';
return selectedValue[0]?.toString();
}
return selectedValue.map((item) => item.toString());
}
return selectedValue?.toString() || '';
return selectedValue?.toString();
};
// eslint-disable-next-line sonarjs/cognitive-complexity
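
With this change `getSelectValue` returns `undefined` rather than an empty string when nothing is selected, presumably so the antd Select can fall back to its placeholder instead of rendering an empty value. Resulting behavior, per the branches above (`variableData` stands for any variable config):

// Behavior implied by the updated getSelectValue:
getSelectValue(undefined, variableData); // undefined (was ''), placeholder shows
getSelectValue(['a'], { ...variableData, multiSelect: false }); // 'a'
getSelectValue(['a', 'b'], { ...variableData, multiSelect: true }); // ['a', 'b']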
@@ -80,6 +84,23 @@ function VariableItem({
[],
);
const { maxTime, minTime } = useSelector<AppState, GlobalReducer>(
(state) => state.globalTime,
);
useEffect(() => {
if (variableData.allSelected && variableData.type === 'QUERY') {
setVariablesToGetUpdated((prev) => {
const variablesQueue = [...prev.filter((v) => v !== variableData.name)];
if (variableData.name) {
variablesQueue.push(variableData.name);
}
return variablesQueue;
});
}
// eslint-disable-next-line react-hooks/exhaustive-deps
}, [minTime, maxTime]);
const [errorMessage, setErrorMessage] = useState<null | string>(null);
const getDependentVariables = (queryValue: string): string[] => {
@@ -111,7 +132,14 @@ function VariableItem({
const variableKey = dependentVariablesStr.replace(/\s/g, '');
return [REACT_QUERY_KEY.DASHBOARD_BY_ID, variableName, variableKey];
// the time range is part of the query key because the variables API now respects the passed time range
return [
REACT_QUERY_KEY.DASHBOARD_BY_ID,
variableName,
variableKey,
`${minTime}`,
`${maxTime}`,
];
};
// eslint-disable-next-line sonarjs/cognitive-complexity
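
Including `minTime` and `maxTime` in the key means react-query treats every time range as a distinct cache entry, so changing the global time picker invalidates the cached variable options and refetches them. A sketch of how such a key is typically consumed (the fetcher name here is hypothetical):

// Sketch: a new time range yields a new key, which triggers a refetch.
const { data } = useQuery(
  getQueryKey(), // [REACT_QUERY_KEY.DASHBOARD_BY_ID, name, variableKey, `${minTime}`, `${maxTime}`]
  () => fetchDashboardVariableOptions(payload), // hypothetical fetcher
);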
@@ -151,10 +179,14 @@ function VariableItem({
valueNotInList = true;
}
}
// variableData.allSelected is checked here so that, when the options change,
// the selection stored in local storage is updated as well
if (
variableData.type === 'QUERY' &&
variableData.name &&
(variablesToGetUpdated.includes(variableData.name) || valueNotInList)
(variablesToGetUpdated.includes(variableData.name) ||
valueNotInList ||
variableData.allSelected)
) {
let value = variableData.selectedValue;
let allSelected = false;
@@ -268,7 +300,7 @@ function VariableItem({
e.stopPropagation();
e.preventDefault();
const isChecked =
variableData.allSelected || selectValue.includes(ALL_SELECT_VALUE);
variableData.allSelected || selectValue?.includes(ALL_SELECT_VALUE);
if (isChecked) {
handleChange([]);
@@ -338,8 +370,8 @@ function VariableItem({
(Array.isArray(selectValue) && selectValue?.includes(option.toString()));
if (isChecked) {
if (mode === ToggleTagValue.Only) {
handleChange(option.toString());
if (mode === ToggleTagValue.Only && variableData.multiSelect) {
handleChange([option.toString()]);
} else if (!variableData.multiSelect) {
handleChange(option.toString());
} else {
@ -430,6 +462,7 @@ function VariableItem({
<span>+ {omittedValues.length} </span>
</Tooltip>
)}
allowClear
>
{enableSelectAll && (
<Select.Option data-testid="option-ALL" value={ALL_SELECT_VALUE}>
@@ -468,11 +501,17 @@ function VariableItem({
{...retProps(option as string)}
onClick={(e): void => handleToggle(e as any, option as string)}
>
<Tooltip title={option.toString()} placement="bottomRight">
<Typography.Text ellipsis className="option-text">
{option.toString()}
</Typography.Text>
</Tooltip>
<Typography.Text
ellipsis={{
tooltip: {
placement: variableData.multiSelect ? 'top' : 'right',
autoAdjustOverflow: true,
},
}}
className="option-text"
>
{option.toString()}
</Typography.Text>
{variableData.multiSelect &&
optionState.tag === option.toString() &&

@@ -74,7 +74,7 @@ export const panelTypeVsYAxisUnit: { [key in PANEL_TYPES]: boolean } = {
[PANEL_TYPES.VALUE]: true,
[PANEL_TYPES.TABLE]: false,
[PANEL_TYPES.LIST]: false,
[PANEL_TYPES.PIE]: false,
[PANEL_TYPES.PIE]: true,
[PANEL_TYPES.BAR]: true,
[PANEL_TYPES.HISTOGRAM]: false,
[PANEL_TYPES.TRACE]: false,

@@ -211,7 +211,11 @@ function RightContainer({
<YAxisUnitSelector
defaultValue={yAxisUnit}
onSelect={setYAxisUnit}
fieldLabel={selectedGraphType === 'Value' ? 'Unit' : 'Y Axis Unit'}
fieldLabel={
selectedGraphType === 'Value' || selectedGraphType === 'Pie'
? 'Unit'
: 'Y Axis Unit'
}
/>
)}
{allowSoftMinMax && (

@@ -16,6 +16,7 @@ function PanelWrapper({
selectedGraph,
tableProcessedDataRef,
customTooltipElement,
searchTerm,
}: PanelWrapperProps): JSX.Element {
const Component = PanelTypeVsPanelWrapper[
selectedGraph || widget.panelTypes
@@ -39,6 +40,7 @@
selectedGraph={selectedGraph}
tableProcessedDataRef={tableProcessedDataRef}
customTooltipElement={customTooltipElement}
searchTerm={searchTerm}
/>
);
}

@@ -4,6 +4,7 @@ import { Color } from '@signozhq/design-tokens';
import { Group } from '@visx/group';
import { Pie } from '@visx/shape';
import { useTooltip, useTooltipInPortal } from '@visx/tooltip';
import { getYAxisFormattedValue } from 'components/Graph/yAxisConfig';
import { themeColors } from 'constants/theme';
import { useIsDarkMode } from 'hooks/useDarkMode';
import { generateColor } from 'lib/uPlotLib/utils/generateColor';
@@ -129,7 +130,12 @@ function PiePanelWrapper({
showTooltip({
tooltipData: {
label,
value: arc.data.value,
// do not bake the unit into the data: the arc allotment is based on the numeric value,
// and a formatted string would treat 4K as smaller than 40
value: getYAxisFormattedValue(
arc.data.value,
widget?.yAxisUnit || 'none',
),
color: arc.data.color,
key: label,
},
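
The comment above captures an ordering pitfall: `getYAxisFormattedValue` produces a display string, and the pie's arc allotment must keep working on raw numbers, since a formatted value such as "4K" is no longer numerically comparable to 40. So the data keeps its numeric `value` and the unit is applied only at tooltip time:

// Anti-pattern (hypothetical): formatting the stored data breaks arc sizing,
// because "4K" is a string, not a number the layout can proportion by.
// arc.data.value = getYAxisFormattedValue(arc.data.value, unit); // don't do this
// As in this diff: format only the copy that is shown to the user.
const display = getYAxisFormattedValue(arc.data.value, widget?.yAxisUnit || 'none');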

@@ -8,6 +8,7 @@ function TablePanelWrapper({
widget,
queryResponse,
tableProcessedDataRef,
searchTerm,
}: PanelWrapperProps): JSX.Element {
const panelData =
(queryResponse.data?.payload?.data?.result?.[0] as any)?.table || [];
@@ -20,6 +21,7 @@
columnUnits={widget.columnUnits}
tableProcessedDataRef={tableProcessedDataRef}
sticky={widget.panelTypes === PANEL_TYPES.TABLE}
searchTerm={searchTerm}
// eslint-disable-next-line react/jsx-props-no-spreading
{...GRID_TABLE_CONFIG}
/>

@@ -23,6 +23,7 @@ export type PanelWrapperProps = {
onDragSelect: (start: number, end: number) => void;
selectedGraph?: PANEL_TYPES;
tableProcessedDataRef?: React.MutableRefObject<RowData[]>;
searchTerm?: string;
customTooltipElement?: HTMLDivElement;
};
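
The optional `searchTerm` added here is threaded from `PanelWrapper` into `TablePanelWrapper` and on into `GridTableComponent` (per the earlier hunks), so a dashboard-level search input can filter table rows. A usage sketch with the surrounding props elided (the `searchTerm` binding is hypothetical):

// Sketch: wiring a search input's value down to a table panel.
<PanelWrapper
  widget={widget}
  queryResponse={queryResponse}
  onDragSelect={onDragSelect}
  searchTerm={searchTerm} // e.g. state from a search box above the panel
/>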

@@ -9,6 +9,20 @@ import store from 'store';
import ChangeHistory from '../index';
import { pipelineData, pipelineDataHistory } from './testUtils';
jest.mock('uplot', () => {
const paths = {
spline: jest.fn(),
bars: jest.fn(),
};
const uplotMock = jest.fn(() => ({
paths,
}));
return {
paths,
default: uplotMock,
};
});
const queryClient = new QueryClient({
defaultOptions: {
queries: {

@@ -9,6 +9,20 @@ import store from 'store';
import { pipelineMockData } from '../mocks/pipeline';
import AddNewPipeline from '../PipelineListsView/AddNewPipeline';
jest.mock('uplot', () => {
const paths = {
spline: jest.fn(),
bars: jest.fn(),
};
const uplotMock = jest.fn(() => ({
paths,
}));
return {
paths,
default: uplotMock,
};
});
export function matchMedia(): void {
Object.defineProperty(window, 'matchMedia', {
writable: true,

@@ -9,6 +9,20 @@ import { pipelineMockData } from '../mocks/pipeline';
import AddNewProcessor from '../PipelineListsView/AddNewProcessor';
import { matchMedia } from './AddNewPipeline.test';
jest.mock('uplot', () => {
const paths = {
spline: jest.fn(),
bars: jest.fn(),
};
const uplotMock = jest.fn(() => ({
paths,
}));
return {
paths,
default: uplotMock,
};
});
beforeAll(() => {
matchMedia();
});

@@ -6,6 +6,20 @@ import { MemoryRouter } from 'react-router-dom';
import i18n from 'ReactI18';
import store from 'store';
jest.mock('uplot', () => {
const paths = {
spline: jest.fn(),
bars: jest.fn(),
};
const uplotMock = jest.fn(() => ({
paths,
}));
return {
paths,
default: uplotMock,
};
});
describe('PipelinePage container test', () => {
it('should render DeleteAction section', () => {
const { asFragment } = render(

@@ -6,6 +6,20 @@ import { MemoryRouter } from 'react-router-dom';
import i18n from 'ReactI18';
import store from 'store';
jest.mock('uplot', () => {
const paths = {
spline: jest.fn(),
bars: jest.fn(),
};
const uplotMock = jest.fn(() => ({
paths,
}));
return {
paths,
default: uplotMock,
};
});
describe('PipelinePage container test', () => {
it('should render DragAction section', () => {
const { asFragment } = render(

@@ -6,6 +6,20 @@ import { MemoryRouter } from 'react-router-dom';
import i18n from 'ReactI18';
import store from 'store';
jest.mock('uplot', () => {
const paths = {
spline: jest.fn(),
bars: jest.fn(),
};
const uplotMock = jest.fn(() => ({
paths,
}));
return {
paths,
default: uplotMock,
};
});
describe('PipelinePage container test', () => {
it('should render EditAction section', () => {
const { asFragment } = render(

@@ -8,6 +8,20 @@ import store from 'store';
import { pipelineMockData } from '../mocks/pipeline';
import PipelineActions from '../PipelineListsView/TableComponents/PipelineActions';
jest.mock('uplot', () => {
const paths = {
spline: jest.fn(),
bars: jest.fn(),
};
const uplotMock = jest.fn(() => ({
paths,
}));
return {
paths,
default: uplotMock,
};
});
describe('PipelinePage container test', () => {
it('should render PipelineActions section', () => {
const { asFragment } = render(

@@ -9,6 +9,20 @@ import { pipelineMockData } from '../mocks/pipeline';
import PipelineExpandView from '../PipelineListsView/PipelineExpandView';
import { matchMedia } from './AddNewPipeline.test';
jest.mock('uplot', () => {
const paths = {
spline: jest.fn(),
bars: jest.fn(),
};
const uplotMock = jest.fn(() => ({
paths,
}));
return {
paths,
default: uplotMock,
};
});
beforeAll(() => {
matchMedia();
});

@@ -11,6 +11,20 @@ import store from 'store';
import { pipelineApiResponseMockData } from '../mocks/pipeline';
import PipelineListsView from '../PipelineListsView';
jest.mock('uplot', () => {
const paths = {
spline: jest.fn(),
bars: jest.fn(),
};
const uplotMock = jest.fn(() => ({
paths,
}));
return {
paths,
default: uplotMock,
};
});
const samplePipelinePreviewResponse = {
isLoading: false,
logs: [

@@ -11,6 +11,20 @@ import { v4 } from 'uuid';
import PipelinePageLayout from '../Layouts/Pipeline';
import { matchMedia } from './AddNewPipeline.test';
jest.mock('uplot', () => {
const paths = {
spline: jest.fn(),
bars: jest.fn(),
};
const uplotMock = jest.fn(() => ({
paths,
}));
return {
paths,
default: uplotMock,
};
});
beforeAll(() => {
matchMedia();
});

@@ -7,6 +7,20 @@ import store from 'store';
import TagInput from '../components/TagInput';
jest.mock('uplot', () => {
const paths = {
spline: jest.fn(),
bars: jest.fn(),
};
const uplotMock = jest.fn(() => ({
paths,
}));
return {
paths,
default: uplotMock,
};
});
describe('Pipeline Page', () => {
it('should render TagInput section', () => {
const { asFragment } = render(

@@ -11,6 +11,20 @@ import {
getTableColumn,
} from '../PipelineListsView/utils';
jest.mock('uplot', () => {
const paths = {
spline: jest.fn(),
bars: jest.fn(),
};
const uplotMock = jest.fn(() => ({
paths,
}));
return {
paths,
default: uplotMock,
};
});
describe('Utils testing of Pipeline Page', () => {
test('it should be check form field of add pipeline', () => {
expect(pipelineFields.length).toBe(3);

Some files were not shown because too many files have changed in this diff.