
I am trying to set up Slack alerting for when a probe for an endpoint is down. I am using the Blackbox Exporter for this.
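
For reference, the http_2xx module used below is the exporter's stock module; in a custom blackbox.yml it is roughly equivalent to this (a sketch, not necessarily the exact exporter config in use):

    modules:
      http_2xx:
        prober: http
        timeout: 5s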

First, I've managed to get alerts for when a probe returns failure by using the following config in my Helm values:

    prometheus:
      prometheusSpec:
        additionalScrapeConfigs:
          ## Monitor external services
          - job_name: "blackbox-http"
            metrics_path: /probe
            params:
              module: [http_2xx] # Look for a HTTP 200 response.
            static_configs:
              - targets:
                  - https://argocd.pe-dev.securekey.com
            relabel_configs:
              - source_labels: [__address__]
                target_label: __param_target
              - source_labels: [__param_target]
                target_label: instance
              - target_label: __address__
                replacement: monitoring-prometheus-blackbox-exporter:9115
    additionalPrometheusRules:
      - name: blackbox-probe.rules.yaml
        groups:
          - name: blackbox-probe.rules
            rules:
              - alert: ProbeFailing
                annotations:
                  description: "Probe failing for endpoint {{ $labels.instance }} for 1m"
                expr: probe_success == 0
                for: 1m
                labels:
                  severity: critical
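
To confirm the probe itself works, the exporter's /probe endpoint can be queried directly (using the service name from the scrape config above; run from inside the cluster):

    curl -s "http://monitoring-prometheus-blackbox-exporter:9115/probe?module=http_2xx&target=https://argocd.pe-dev.securekey.com" | grep probe_success
    # probe_success 1   (becomes 0 when the endpoint is down)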

When I take down the endpoint, Alertmanager registers the alert:

(screenshot: the ProbeFailing alert shown as active in the Alertmanager UI)
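
The firing alert can also be confirmed with amtool (assuming the Alertmanager API is port-forwarded to localhost:9093; the service name to forward varies by release):

    amtool alert query alertname=ProbeFailing --alertmanager.url=http://localhost:9093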

Now my next task is to generate a Slack message. For that, I've made the following config in Alertmanager:

    config:
      global:
        slack_api_url: "<Slack incoming webhook URL; it works, I've tested it manually>"
        resolve_timeout: 5m
      inhibit_rules:
        - source_match_re:
            alertname: KubeDaemonSetRolloutStuck|KubeDaemonSetMisScheduled|KubeDaemonSetNotScheduled
            daemonset: monitoring-prometheus-node-exporter
          target_match_re:
            pod_name: ^(monitoring-prometheus-node-exporter-).*
      route:
        group_by: ["namespace", "alertname"]
        group_wait: 30s
        group_interval: 5m
        repeat_interval: 12h
        receiver: 'null'
        routes:
          - match:
              alertname: ProbeFailing
              severity: critical
            match_re:
              instance: https://argocd.pe-dev.securekey.com/
            receiver: slack-pe
            continue: true
      receivers:
      - name: 'null'
      - name: slack-pe
        slack_configs:
          - channel: "#test-alert"
            title: '{{ template "alertmanager.slack.titletext" . }}'
            title_link: '{{ .ExternalURL }}{{ template "alertmanager.slack.titlelink" . }}'
            text: '{{ template "alertmanager.slack.text" . }}'
            send_resolved: true
      templates:
      - "slack_templates.tmpl"

    ## Pass the Alertmanager configuration directives through Helm's templating
    ## engine. If the Alertmanager configuration contains Alertmanager templates,
    ## they'll need to be properly escaped so that they are not interpreted by
    ## Helm
    ## ref: https://helm.sh/docs/developing_charts/#using-the-tpl-function
    ##      https://prometheus.io/docs/alerting/configuration/#tmpl_string
    ##      https://prometheus.io/docs/alerting/notifications/
    ##      https://prometheus.io/docs/alerting/notification_examples/
    tplConfig: false

    ## Alertmanager template files to format alerts
    ## By default, templateFiles are placed in /etc/alertmanager/config/ and if
    ## they have a .tmpl file suffix will be loaded. See config.templates above
    ## to change, add other suffixes. If adding other suffixes, be sure to update
    ## config.templates above to include those suffixes.
    ## ref: https://prometheus.io/docs/alerting/notifications/
    ##      https://prometheus.io/docs/alerting/notification_examples/
    ##
    templateFiles:
      slack_templates.tmpl: |-
        {{ define "alertmanager.slack.titletext" }}(TEST :: {{ index .CommonLabels "severity" }}) {{ index .CommonLabels "alertname" }} [{{ .Status }}{{ if eq .Status "firing" }}:{{ .Alerts.Firing | len }}{{ end }}]{{ end }}

        {{ define "alertmanager.slack.titlelink" }}/#/alerts?receiver={{ .Receiver }}&filter=%7B{{ range .CommonLabels.SortedPairs }}{{ .Name }}%3D"{{ .Value }}"%2C%20{{ end }}severity%3D"{{ index .CommonLabels "severity" }}"%7D{{ end }}

        {{ define "alertmanager.slack.text" }}
        {{ range .Alerts }}
        {{ if .Annotations.description }}
        {{ .Annotations.description }}
        {{ end }}

        {{ end }}
        {{ end }}
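
As a sanity check on the routing tree, amtool can report which receiver a given label set would land on (assuming amtool is installed and the rendered Alertmanager config is saved locally as alertmanager.yaml, a hypothetical file name):

    amtool config routes test --config.file=alertmanager.yaml \
      alertname=ProbeFailing severity=critical instance=https://argocd.pe-dev.securekey.com
    # prints the matching receiver, e.g. slack-pe (or null if no route matches)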

But I do not seem to be getting any Slack alerts. Is there some config I have missed?
