Squashed commit of the following:
commit 6f15520cea6a1098915c9ca340dbe42de6a5de1d
Author: Tim Pansino <[email protected]>
Date:   Mon May 15 14:28:50 2023 -0700

    TEMP

commit 1a28d36
Author: Tim Pansino <[email protected]>
Date:   Thu May 11 17:28:27 2023 -0700

    Cover tags as list not dict

commit 71261e3
Merge: 459e085 c2d4629
Author: Timothy Pansino <[email protected]>
Date:   Thu May 11 16:59:11 2023 -0700

    Merge branch 'main' into feature-dimensional-metrics

commit 459e085
Author: Tim Pansino <[email protected]>
Date:   Thu May 11 16:57:16 2023 -0700

    Add testing for dimensional metrics

commit ed33957
Author: Tim Pansino <[email protected]>
Date:   Thu May 11 16:56:31 2023 -0700

    Add attribute processing to metric identity

commit 6caf71e
Author: Tim Pansino <[email protected]>
Date:   Thu May 11 16:56:16 2023 -0700

    Add dimensional stats table to stats engine

commit 5e1cc9d
Author: Tim Pansino <[email protected]>
Date:   Wed May 10 16:00:42 2023 -0700

    Squashed commit of the following:

    commit c2d4629
    Author: Timothy Pansino <[email protected]>
    Date:   Wed May 10 15:59:13 2023 -0700

        Add required option for tox v4 (#795)

        * Add required option for tox v4

        * Update tox in GHA

        * Remove py27 no-cache-dir

    commit a963649
    Author: Hannah Stepanek <[email protected]>
    Date:   Tue May 9 10:46:39 2023 -0700

        Run coverage around pytest (#813)

        * Run coverage around pytest

        * Trigger tests

        * Fixup

        * Add redis client_no_touch to ignore list

        * Temporarily remove kafka from coverage

        * Remove coverage for old libs

    commit 3d82845
    Author: Lalleh Rafeei <[email protected]>
    Date:   Wed May 3 14:50:30 2023 -0700

        Omit some frameworks from coverage analysis (#810)

        * Omit some frameworks from coverage analysis

        * Remove commas

        * Change format of omit

        * Add relative_files option to coverage

        * Add absolute directory

        * Add envsitepackagedir

        * Add coveragerc file

        * Add codecov.yml

        * [Mega-Linter] Apply linters fixes

        * Revert coveragerc file settings

        * Add files in packages and more frameworks

        * Remove commented line

        ---------

        Co-authored-by: lrafeei <[email protected]>
        Co-authored-by: Hannah Stepanek <[email protected]>

    commit fd0fa35
    Author: Uma Annamalai <[email protected]>
    Date:   Tue May 2 10:55:36 2023 -0700

        Add testing for genshi and mako. (#799)

        * Add testing for genshi and mako.

        * [Mega-Linter] Apply linters fixes

        ---------

        Co-authored-by: umaannamalai <[email protected]>
        Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com>

    commit be4fb3d
    Author: Lalleh Rafeei <[email protected]>
    Date:   Mon May 1 16:01:09 2023 -0700

        Add tests for Waitress (#797)

        * Change import format

        * Initial commit

        * Add more tests to adapter_waitress

        * Remove commented out code

        * [Mega-Linter] Apply linters fixes

        * Add assertions to all tests

        * Add more NR testing to waitress

        ---------

        Co-authored-by: lrafeei <[email protected]>
        Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com>

    commit 7103506
    Author: Hannah Stepanek <[email protected]>
    Date:   Mon May 1 14:12:31 2023 -0700

        Add tests for pyodbc (#796)

        * Add tests for pyodbc

        * Move imports into tests to get import coverage

        * Fixup: remove time import

        * Trigger tests

        ---------

        Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com>

    commit 363122a
    Author: Hannah Stepanek <[email protected]>
    Date:   Mon May 1 13:34:35 2023 -0700

        Pin virtualenv, fix pip arg deprecation & disable kafka tests (#803)

        * Pin virtualenv

        * Fixup: use 20.21.1 instead

        * Replace install-options with config-settings

        See pypa/pip#11358.

        * Temporarily disable kafka tests

commit c2d4629
Author: Timothy Pansino <[email protected]>
Date:   Wed May 10 15:59:13 2023 -0700

    Add required option for tox v4 (#795)

    * Add required option for tox v4

    * Update tox in GHA

    * Remove py27 no-cache-dir

commit a963649
Author: Hannah Stepanek <[email protected]>
Date:   Tue May 9 10:46:39 2023 -0700

    Run coverage around pytest (#813)

    * Run coverage around pytest

    * Trigger tests

    * Fixup

    * Add redis client_no_touch to ignore list

    * Temporarily remove kafka from coverage

    * Remove coverage for old libs

commit dc81a50
Author: Tim Pansino <[email protected]>
Date:   Sat May 6 14:16:14 2023 -0700

    Wiring dimensional metrics

commit 3d82845
Author: Lalleh Rafeei <[email protected]>
Date:   Wed May 3 14:50:30 2023 -0700

    Omit some frameworks from coverage analysis (#810)

    * Omit some frameworks from coverage analysis

    * Remove commas

    * Change format of omit

    * Add relative_files option to coverage

    * Add absolute directory

    * Add envsitepackagedir

    * Add coveragerc file

    * Add codecov.yml

    * [Mega-Linter] Apply linters fixes

    * Revert coveragerc file settings

    * Add files in packages and more frameworks

    * Remove commented line

    ---------

    Co-authored-by: lrafeei <[email protected]>
    Co-authored-by: Hannah Stepanek <[email protected]>

commit fd0fa35
Author: Uma Annamalai <[email protected]>
Date:   Tue May 2 10:55:36 2023 -0700

    Add testing for genshi and mako. (#799)

    * Add testing for genshi and mako.

    * [Mega-Linter] Apply linters fixes

    ---------

    Co-authored-by: umaannamalai <[email protected]>
    Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com>

commit be4fb3d
Author: Lalleh Rafeei <[email protected]>
Date:   Mon May 1 16:01:09 2023 -0700

    Add tests for Waitress (#797)

    * Change import format

    * Initial commit

    * Add more tests to adapter_waitress

    * Remove commented out code

    * [Mega-Linter] Apply linters fixes

    * Add assertions to all tests

    * Add more NR testing to waitress

    ---------

    Co-authored-by: lrafeei <[email protected]>
    Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com>

commit 7103506
Author: Hannah Stepanek <[email protected]>
Date:   Mon May 1 14:12:31 2023 -0700

    Add tests for pyodbc (#796)

    * Add tests for pyodbc

    * Move imports into tests to get import coverage

    * Fixup: remove time import

    * Trigger tests

    ---------

    Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com>
TimPansino committed Jun 7, 2023
1 parent ac6f712 commit d93e016
Showing 14 changed files with 588 additions and 6 deletions.
8 changes: 8 additions & 0 deletions newrelic/api/application.py
@@ -142,6 +142,14 @@ def record_custom_metrics(self, metrics):
if self.active and metrics:
self._agent.record_custom_metrics(self._name, metrics)

def record_dimensional_metric(self, name, value, tags=None):
if self.active:
self._agent.record_dimensional_metric(self._name, name, value, tags)

def record_dimensional_metrics(self, metrics):
if self.active and metrics:
self._agent.record_dimensional_metrics(self._name, metrics)

def record_custom_event(self, event_type, params):
if self.active:
self._agent.record_custom_event(self._name, event_type, params)
52 changes: 51 additions & 1 deletion newrelic/api/transaction.py
@@ -68,7 +68,7 @@
from newrelic.core.custom_event import create_custom_event
from newrelic.core.log_event_node import LogEventNode
from newrelic.core.stack_trace import exception_stack
from newrelic.core.stats_engine import CustomMetrics, SampledDataSet
from newrelic.core.stats_engine import CustomMetrics, DimensionalMetrics, SampledDataSet
from newrelic.core.thread_utilization import utilization_tracker
from newrelic.core.trace_cache import (
TraceCacheActiveTraceError,
@@ -309,6 +309,7 @@ def __init__(self, application, enabled=None, source=None):
self.synthetics_header = None

self._custom_metrics = CustomMetrics()
self._dimensional_metrics = DimensionalMetrics()

global_settings = application.global_settings

@@ -591,6 +592,7 @@ def __exit__(self, exc, value, tb):
apdex_t=self.apdex,
suppress_apdex=self.suppress_apdex,
custom_metrics=self._custom_metrics,
dimensional_metrics=self._dimensional_metrics,
guid=self.guid,
cpu_time=self._cpu_user_time_value,
suppress_transaction_trace=self.suppress_transaction_trace,
@@ -1597,6 +1599,16 @@ def record_custom_metrics(self, metrics):
for name, value in metrics:
self._custom_metrics.record_custom_metric(name, value)

def record_dimensional_metric(self, name, value, tags=None):
self._dimensional_metrics.record_dimensional_metric(name, value, tags)

def record_dimensional_metrics(self, metrics):
for metric in metrics:
name, value = metric[:2]
tags = metric[2] if len(metric) >= 3 else None

self._dimensional_metrics.record_dimensional_metric(name, value, tags)

def record_custom_event(self, event_type, params):
settings = self._settings

@@ -1908,6 +1920,44 @@ def record_custom_metrics(metrics, application=None):
application.record_custom_metrics(metrics)


def record_dimensional_metric(name, value, tags=None, application=None):
if application is None:
transaction = current_transaction()
if transaction:
transaction.record_dimensional_metric(name, value, tags)
else:
_logger.debug(
"record_dimensional_metric has been called but no "
"transaction was running. As a result, the following metric "
"has not been recorded. Name: %r Value: %r Tags: %r. To correct this "
"problem, supply an application object as a parameter to this "
"record_dimensional_metric call.",
name,
value,
tags,
)
elif application.enabled:
application.record_dimensional_metric(name, value, tags)


def record_dimensional_metrics(metrics, application=None):
if application is None:
transaction = current_transaction()
if transaction:
transaction.record_dimensional_metrics(metrics)
else:
_logger.debug(
"record_dimensional_metrics has been called but no "
"transaction was running. As a result, the following metrics "
"have not been recorded: %r. To correct this problem, "
"supply an application object as a parameter to this "
"record_dimensional_metrics call.",
list(metrics),
)
elif application.enabled:
application.record_dimensional_metrics(metrics)


def record_custom_event(event_type, params, application=None):
"""Record a custom event.
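The `record_dimensional_metrics` loop in the transaction diff accepts each metric as a `(name, value)` or `(name, value, tags)` tuple. A minimal standalone sketch of that unpacking convention (the metric names below are illustrative, not taken from the agent):

```python
def unpack_metrics(metrics):
    """Yield (name, value, tags) triples, defaulting tags to None.

    Mirrors the metric[:2] / metric[2] unpacking used by
    record_dimensional_metrics in the diff above.
    """
    for metric in metrics:
        name, value = metric[:2]
        tags = metric[2] if len(metric) >= 3 else None
        yield name, value, tags


# Illustrative metric names, not part of the agent's API:
metrics = [
    ("Custom/RequestCount", 1),
    ("Custom/Latency", 0.25, {"region": "us-west"}),
]
unpacked = list(unpack_metrics(metrics))
```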
37 changes: 37 additions & 0 deletions newrelic/common/metric_utils.py
@@ -0,0 +1,37 @@
# Copyright 2010 New Relic, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

"""
This module implements functions for creating a unique identity from a name and set of tags for use in dimensional metrics.
"""

from newrelic.core.attribute import process_user_attribute


def create_metric_identity(name, tags=None):
if tags:
# Convert dicts to an iterable of tuples, other iterables should already be in this form
if isinstance(tags, dict):
tags = tags.items()

# Apply attribute system sanitization.
# process_user_attribute returns (None, None) for results that fail sanitization.
# The filter removes these results from the iterable before creating the frozenset.
tags = frozenset(filter(lambda args: args[0] is not None, map(lambda args: process_user_attribute(*args), tags)))

# Set empty iterables after filtering to None
if not tags and tags is not None:
tags = None

return (name, tags)
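The identity scheme in `create_metric_identity` can be sketched standalone. Here `sanitize` is a simplified stand-in for `process_user_attribute` (the agent's real validation is richer); the point is that the frozenset makes the tags hashable and order-independent, so a dict and an equivalent list of pairs map to the same aggregation bucket:

```python
def sanitize(key, value):
    # Simplified stand-in for process_user_attribute: reject non-string keys.
    if not isinstance(key, str):
        return (None, None)
    return (key, value)


def metric_identity(name, tags=None):
    # Mirrors create_metric_identity: dicts and pair iterables both become
    # a frozenset of sanitized (key, value) pairs; empty results become None.
    if tags:
        if isinstance(tags, dict):
            tags = tags.items()
        tags = frozenset(
            pair for pair in (sanitize(k, v) for k, v in tags) if pair[0] is not None
        ) or None
    return (name, tags)


# A dict and a reordered list of pairs produce the same identity:
a = metric_identity("Latency", {"region": "us-west", "az": "a"})
b = metric_identity("Latency", [("az", "a"), ("region", "us-west")])
```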
27 changes: 27 additions & 0 deletions newrelic/core/agent.py
@@ -524,6 +524,33 @@ def record_custom_metrics(self, app_name, metrics):

application.record_custom_metrics(metrics)

def record_dimensional_metric(self, app_name, name, value, tags=None):
"""Records a dimensional metric for the named application. If there has
been no prior request to activate the application, the metric is
discarded.
"""

application = self._applications.get(app_name, None)
if application is None or not application.active:
return

application.record_dimensional_metric(name, value, tags)

def record_dimensional_metrics(self, app_name, metrics):
"""Records the metrics for the named application. If there has
been no prior request to activate the application, the metrics are
discarded. The metrics should be an iterable yielding tuples
consisting of the name, value, and optional tags.
"""

application = self._applications.get(app_name, None)
if application is None or not application.active:
return

application.record_dimensional_metrics(metrics)

def record_custom_event(self, app_name, event_type, params):
application = self._applications.get(app_name, None)
if application is None or not application.active:
50 changes: 50 additions & 0 deletions newrelic/core/application.py
@@ -510,6 +510,9 @@ def connect_to_data_collector(self, activate_agent):
with self._stats_custom_lock:
self._stats_custom_engine.reset_stats(configuration)

with self._stats_lock:
self._stats_engine.reset_stats(configuration)

# Record an initial start time for the reporting period and
# clear record of last transaction processed.

@@ -860,6 +863,50 @@ def record_custom_metrics(self, metrics):
self._global_events_account += 1
self._stats_custom_engine.record_custom_metric(name, value)

def record_dimensional_metric(self, name, value, tags=None):
"""Record a dimensional metric against the application independent
of a specific transaction.
NOTE that this requires locking the stats engine for dimensional
metrics, so it will hurt performance under heavy use. It is better
to record the dimensional metric against an active transaction, as
it will then be aggregated at the end of the transaction along with
all other metrics, and no additional locking will be required.
"""

if not self._active_session:
return

with self._stats_lock:
self._global_events_account += 1
self._stats_engine.record_dimensional_metric(name, value, tags)

def record_dimensional_metrics(self, metrics):
"""Record a set of dimensional metrics against the application
independent of a specific transaction.
NOTE that this requires locking the stats engine for dimensional
metrics, so it will hurt performance under heavy use. It is better
to record the dimensional metrics against an active transaction, as
they will then be aggregated at the end of the transaction along with
all other metrics, and no additional locking will be required.
"""

if not self._active_session:
return

with self._stats_lock:
for metric in metrics:
name, value = metric[:2]
tags = metric[2] if len(metric) >= 3 else None

self._global_events_account += 1
self._stats_engine.record_dimensional_metric(name, value, tags)

def record_custom_event(self, event_type, params):
if not self._active_session:
return
@@ -1452,11 +1499,14 @@ def harvest(self, shutdown=False, flexible=False):
_logger.debug("Normalizing metrics for harvest of %r.", self._app_name)

metric_data = stats.metric_data(metric_normalizer)
dimensional_metric_data = stats.dimensional_metric_data(metric_normalizer)

_logger.debug("Sending metric data for harvest of %r.", self._app_name)

# Send metrics
self._active_session.send_metric_data(self._period_start, period_end, metric_data)
if dimensional_metric_data:
self._active_session.send_dimensional_metric_data(self._period_start, period_end, dimensional_metric_data)

_logger.debug("Done sending data for harvest of %r.", self._app_name)

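The locking tradeoff those application.py docstrings describe can be sketched with hypothetical names (this is not the agent's actual stats engine): application-level recording contends on a lock for every call, while transaction-level recording batches locally and merges under a single lock acquisition.

```python
import threading


class StatsEngine:
    """Toy aggregator illustrating per-call vs. batched locking."""

    def __init__(self):
        self._lock = threading.Lock()
        self._stats = {}

    def record(self, name, value):
        # Application path: every call contends on the lock.
        with self._lock:
            self._stats[name] = self._stats.get(name, 0) + value

    def merge(self, local_stats):
        # Transaction path: one lock acquisition for the whole batch,
        # aggregated lock-free beforehand by the transaction.
        with self._lock:
            for name, value in local_stats.items():
                self._stats[name] = self._stats.get(name, 0) + value


engine = StatsEngine()
engine.record("Custom/A", 1)
engine.merge({"Custom/A": 2, "Custom/B": 5})
```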
21 changes: 21 additions & 0 deletions newrelic/core/data_collector.py
@@ -35,6 +35,8 @@

_logger = logging.getLogger(__name__)

DIMENSIONAL_METRIC_DATA_TEMP = [] # TODO: REMOVE THIS


class Session(object):
PROTOCOL = AgentProtocol
@@ -143,6 +145,25 @@ def send_metric_data(self, start_time, end_time, metric_data):
payload = (self.agent_run_id, start_time, end_time, metric_data)
return self._protocol.send("metric_data", payload)

def send_dimensional_metric_data(self, start_time, end_time, metric_data):
"""Called to submit dimensional metric data for a specified period of time.
Time values are seconds since the UNIX epoch, as returned by the
time.time() function. The metric data should be an iterable of
specific metrics.
NOTE: This data is not sent to the normal agent endpoints but to the
MELT API endpoints, to keep the entity separate. This is for use
with the machine learning integration only.
"""

payload = (self.agent_run_id, start_time, end_time, metric_data)
# return self._protocol.send("metric_data", payload)

# TODO: REMOVE THIS. Replace with actual protocol.
DIMENSIONAL_METRIC_DATA_TEMP.append(payload)
_logger.debug("Dimensional Metrics: %r", metric_data)
return 200

def send_log_events(self, sampling_info, log_event_data):
"""Called to submit sample set for log events."""
