diff --git a/CHANGELOG.md b/CHANGELOG.md index dd9e6a70f..7e1c3a676 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -1,5 +1,36 @@ # Changelog +## [3.4.5] - 2022-11-03 + +### Added + +* ARGO-3971 Store default ports in POEM +* ARGO-3980 Support bulk delete of service types +* ARGO-3981 Introduce bulk add service types view +* ARGO-3983 Add service types pagination +* ARGO-4009 Enable Hot Module Replacement and Django internal web server +* ARGO-4014 Define min and max width of name column in Service Types list + +### Changed + +* ARGO-3711 Reduce Metric model to what is strictly necessary +* ARGO-3982 Remove service types sync and related DB tables +* ARGO-4004 Ensure that only service type names without whitespace can be created from the UI +* ARGO-4008 Have red border around changed description field +* ARGO-4012 One common Save button instead of one in every row +* ARGO-4031 Switch API key page to react-hook-form library +* ARGO-4032 Switch groups page to react-hook-form library +* ARGO-4039 Switch package page to react-hook-form library +* ARGO-4051 Ensure POEM wheel essential dependencies installed automatically + +### Fixed + +* ARGO-4005 Marking field tuple on filtered view ends with description fields populated from neighboring tuple +* ARGO-4006 Case insensitive search on bulk delete and placeholder missing +* ARGO-4028 Adding of new metric profile is broken +* ARGO-4094 Metric parameter overrides not handling space in parameter value +* ARGO-4095 Clean requirements.txt + ## [3.4.4] - 2022-09-01 ### Added diff --git a/README.md b/README.md index bf7caeb4a..8c1bbc516 100644 --- a/README.md +++ b/README.md @@ -28,7 +28,7 @@ Web application is served with Apache web server and uses PostgreSQL as database Backend: `Django`, `dj-rest-auth`, `django-tenants`, `django-webpack-loader`, `djangorestframework`, `djangorestframework-api-key`, `djangosaml2`, `psycopg2-binary` -Frontend: `ReactJS`, `formik`, `react-autosuggest`, `react-diff-viewer`, `react-dom`, `react-fontawesome`, `react-helmet`, `react-notifications`, `react-popup`, `react-router`, `react-select`, `react-table`, `reactstrap`, `webpack`, `yup` +Frontend: `ReactJS`, `formik`, `react-hook-form`, `react-autosuggest`, `react-diff-viewer`, `react-dom`, `react-fontawesome`, `react-helmet`, `react-notifications`, `react-popup`, `react-router`, `react-select`, `react-table`, `reactstrap`, `webpack`, `yup` ## User documentation and instances @@ -246,6 +246,9 @@ Part of the REST API is protected by token so for tenants that consume those API ReportsTopologyTags = https://api.devel.argo.grnet.gr/api/v2/topology/tags ReportsTopologyGroups = https://api.devel.argo.grnet.gr/api/v2/topology/groups ReportsTopologyEndpoints = https://api.devel.argo.grnet.gr/api/v2/topology/endpoints + ServiceTypes = https://api.devel.argo.grnet.gr/api/v2/topology/service-types + Metrics = https://api.devel.argo.grnet.gr/api/v4/admin/metrics + This section lists WEB-API methods for the resources that are not stored in POEM's PostgreSQL DB, but instead are consumed from ARGO WEB-API services. POEM @@ -435,13 +438,49 @@ Starting of multi-container application: docker/ $ docker-compose up ``` +### Web server + +In the development environment, the application can be served either via the Apache web server or via the internal Django web server coupled with Webpack's dev server for Hot Module Replacement (HMR) functionality. Helper make target rules are provided in [poem/Poem/Makefile](poem/Poem/Makefile).
+ +#### Apache + +When serving with Apache, the bundle created by Webpack needs to be manually placed as a Django staticfile every time the bundle is recreated. Webpack can monitor changes in the React code and recreate the bundle on the fly. + +Start Webpack's watch mode: +``` +make devel-watch +``` + +After making changes, to see them reflected, the developer needs to place the newly created bundle as a Django staticfile and restart Apache within the container environment. A make target rule is provided for that purpose: +``` +make place-new-bundle +``` + +#### Django web server + +The advantage of using the Django web server is that backend code can be easily debugged: the developer can use a debugger, set breakpoints, and trace the execution of the code. The Django web server automatically reloads on every change to the backend code. Webpack's bundles are placed automatically, as they are picked up from `webpack-dev-server` running at `localhost:3000`. Moreover, the developer can use the HMR functionality, since `webpack-dev-server` can trigger a browser reload for every change in the frontend code. + +Start the Django web server at `0.0.0.0:8000`: +``` +make devel-django-server +``` + +Start `webpack-dev-server` at `localhost:3000`: +``` +make devel-webpack-server +``` +Or start it with HMR enabled: +``` +make devel-webpack-server-hmr +``` + ### Packaging Deployment of new versions is done with wheel packages that contain both backend Python and frontend Javascript code. Packages are build using setuptools and helper make target rules are provided in [Makefile](Makefile) and in [poem/Poem/Makefile](poem/Poem/Makefile). Latter is used to create a devel or production bundle of frontend Javascript code and place it as Django staticfiles, while the former is used to create Python wheel package. * frontend `Makefile` package targets: - - `make devel` - create a development Webpack bundle - - `make prod` - create a production Webpack bundle + - `make devel-bundle` - create a development Webpack bundle + - `make prod-bundle` - create a production Webpack bundle - `make place-new-bundle` - place created bundle as Django staticfile * backend `Makefile` package targets: - `make wheel-devel` - create date-tagged wheel package diff --git a/bin/poem-importservices b/bin/poem-importservices deleted file mode 100755 index a21f0c941..000000000 --- a/bin/poem-importservices +++ /dev/null @@ -1,43 +0,0 @@ -#!/bin/sh - -RUNASUSER="apache" -SITEPACK=$(python -c "from distutils.sysconfig import get_python_lib; print(get_python_lib())") - -usage() -{ - printf "Usage: %s -f -s \n" $(basename $0) >&2 - printf " [-f] - JSON file with services\n" >&2 - printf " [-s] - schema\n" >&2 - exit 2 -} - -if [[ $# == 0 ]] -then - usage -fi - -while getopts 'f:s:h' OPTION -do - case $OPTION in - f) - jsonfile=$OPTARG - ;; - s) - schema=$OPTARG - ;; - h) - usage - ;; - ?) - usage - ;; - esac -done - -if [[ ! -z "$jsonfile" && !
-z "$schema" ]] -then - su -m -s /bin/sh $RUNASUSER -c \ - "poem-manage import_services --json $jsonfile --schema $schema" -else - usage -fi diff --git a/bin/poem-syncservtype b/bin/poem-syncservtype deleted file mode 100755 index 058410005..000000000 --- a/bin/poem-syncservtype +++ /dev/null @@ -1,14 +0,0 @@ -#!/bin/sh - -RUNASUSER="apache" -SITEPACK=$(python -c "from distutils.sysconfig import get_python_lib; print(get_python_lib())") - -if (( $# > 0 )) -then - echo -e "\nScript does not take any argument and should be called without any\n" - exit 1 -fi - -su -m -s /bin/sh $RUNASUSER -c \ -"export DJANGO_SETTINGS_MODULE=Poem.settings REQUESTS_CA_BUNDLE=/etc/pki/tls/certs/ca-bundle.crt && \ -python $SITEPACK/Poem/sync/poem-syncservtype.py" diff --git a/bin/poem-tenant b/bin/poem-tenant index 32fe43e68..d6e180a83 100755 --- a/bin/poem-tenant +++ b/bin/poem-tenant @@ -69,8 +69,6 @@ su -m -s /bin/sh $RUNASUSER -c \ if [[ $name != 'all' ]] then - su -m -s /bin/sh $RUNASUSER -c \ - "poem-manage tenant_command loaddata initial_data.json --schema=$schema" su -m -s /bin/sh $RUNASUSER -c \ "poem-manage tenant_command import_internal_metrics --schema=$schema" fi diff --git a/cron/poem-sync b/cron/poem-sync deleted file mode 100644 index ab18a0592..000000000 --- a/cron/poem-sync +++ /dev/null @@ -1 +0,0 @@ -50 * * * * root source /etc/profile.d/venv_poem.sh; workon poem; $VIRTUAL_ENV/bin/poem-syncservtype diff --git a/docker/Dockerfile b/docker/Dockerfile index 48b1b2eea..6f8e6e2ca 100644 --- a/docker/Dockerfile +++ b/docker/Dockerfile @@ -51,8 +51,7 @@ WORKDIR poem-2-devel RUN source /etc/profile.d/venv_poem.sh; \ workon poem; \ make wheel-devel; \ - pip3 install dist/*; \ - pip3 install -r requirements.txt + pip3 install dist/*; ADD https://cacerts.digicert.com/TERENAeScienceSSLCA3.crt.pem /etc/pki/tls/certs/ RUN yum -y install ca-policy-egi-core; \ rm -rf /etc/pki/ca-trust/source/anchors/*; \ diff --git a/docker/docker-compose.yml b/docker/docker-compose.yml index bc0bb9ad4..cdfbd8dba 100755 --- a/docker/docker-compose.yml +++ b/docker/docker-compose.yml @@ -9,6 +9,8 @@ services: hostname: poem-react tty: true stdin_open: true + ports: + - "8000:8000" depends_on: - db-poem volumes: @@ -19,6 +21,7 @@ services: - ../docker/syncsite.sh:/home/user/syncsite.sh - ../docker/safety.sh:/home/user/safety.sh - ../docker/collectstatic.sh:/home/user/collectstatic.sh + - ../docker/run-django-server.sh:/home/user/run-django-server.sh - ../docker/restarthttpd.sh:/home/user/restarthttpd.sh - ../:/home/user/poem-source - ../poem/Poem/:$VENV/lib/python3.6/site-packages/Poem @@ -31,10 +34,8 @@ services: - ../bin/poem-clearsessions:$VENV/bin/poem-clearsessions - ../bin/poem-manage:$VENV/bin/poem-manage - ../bin/poem-genseckey:$VENV/bin/poem-genseckey - - ../bin/poem-syncservtype:$VENV/bin/poem-syncservtype - ../bin/poem-tenant:$VENV/bin/poem-tenant - ../bin/poem-token:$VENV/bin/poem-token - - ../bin/poem-importservices:$VENV/bin/poem-importservices networks: app_net: ipv4_address: 172.19.0.2 diff --git a/docker/poem.conf b/docker/poem.conf index f206ff795..f829e0ec0 100644 --- a/docker/poem.conf +++ b/docker/poem.conf @@ -27,7 +27,8 @@ Reports = https://api.devel.argo.grnet.gr/api/v2/reports ReportsTopologyTags = https://api.devel.argo.grnet.gr/api/v2/topology/tags ReportsTopologyGroups = https://api.devel.argo.grnet.gr/api/v2/topology/groups ReportsTopologyEndpoints = https://api.devel.argo.grnet.gr/api/v2/topology/endpoints -ReportsCrud = True +ServiceTypes = 
https://api.devel.argo.grnet.gr/api/v2/topology/service-types +Metrics = https://api.devel.argo.grnet.gr/api/v2/admin/metrics [GENERAL_ALL] diff --git a/docker/run-django-server.sh b/docker/run-django-server.sh new file mode 100755 index 000000000..e5d1eea57 --- /dev/null +++ b/docker/run-django-server.sh @@ -0,0 +1,7 @@ +#!/bin/bash + +RUNASROOT="su -m -s /bin/bash root -c" +PIDS=$(pgrep -f runserver | tr '\n' ' ') +$RUNASROOT "kill -9 $PIDS 2>/dev/null" + +$RUNASROOT ". /home/pyvenv/poem/bin/activate && poem-manage runserver --settings=Poem.settings-devserver 0.0.0.0:8000" diff --git a/docker/setup.sh b/docker/setup.sh index 0c4cf01fe..c448f03db 100755 --- a/docker/setup.sh +++ b/docker/setup.sh @@ -10,4 +10,3 @@ cp -f poem.conf $VIRTUAL_ENV/etc/poem/poem.conf chown -R apache:apache $VIRTUAL_ENV ln -f -s $VIRTUAL_ENV/etc/httpd/conf.d/poem.conf /opt/rh/httpd24/root/etc/httpd/conf.d/ ln -f -s $VIRTUAL_ENV/etc/cron.d/poem-clearsessions /etc/cron.d/ -ln -f -s $VIRTUAL_ENV/etc/cron.d/poem-sync /etc/cron.d/ diff --git a/etc/poem.conf.template b/etc/poem.conf.template index f070cdee1..9c390dc16 100644 --- a/etc/poem.conf.template +++ b/etc/poem.conf.template @@ -27,6 +27,7 @@ Reports = https://api.devel.argo.grnet.gr/api/v2/reports ReportsTopologyTags = https://api.devel.argo.grnet.gr/api/v2/topology/tags ReportsTopologyGroups = https://api.devel.argo.grnet.gr/api/v2/topology/groups ReportsTopologyEndpoints = https://api.devel.argo.grnet.gr/api/v2/topology/endpoints +ServiceTypes = https://api.devel.argo.grnet.gr/api/v2/topology/service-types Metrics = https://api.devel.argo.grnet.gr/api/v2/admin/metrics [GENERAL_ALL] diff --git a/helpers/metric_json_helper.py b/helpers/metric_json_helper.py new file mode 100755 index 000000000..122ef2a26 --- /dev/null +++ b/helpers/metric_json_helper.py @@ -0,0 +1,62 @@ +#!/usr/bin/python3 + +import json +import argparse +import os + + +def main(): + parser = argparse.ArgumentParser( + "Convert old metric format to new for all the files in the given " + "directory; newly created files will have suffix '-new'" + ) + parser.add_argument( + "-d", "--directory", dest="directory", required=True, type=str, + help="directory with the dumped json files" + ) + args = parser.parse_args() + + files = [ + os.path.join(args.directory, f) for f in os.listdir(args.directory) if + os.path.isfile(os.path.join(args.directory, f)) + ] + + for file in files: + with open(file, "r") as f: + data = json.load(f) + + changed = False + new_data = [item for item in data if item["model"] != "poem.metrictype"] + for item in new_data: + if item["model"] == "poem.metric": + if item["fields"]["probekey"]: + item["fields"]["probeversion"] = \ + f"{item['fields']['probekey'][0]} " \ + f"({item['fields']['probekey'][1]})" + + else: + item["fields"]["probeversion"] = None + + item["fields"].pop("mtype") + item["fields"].pop("probekey") + item["fields"].pop("description") + item["fields"].pop("parent") + item["fields"].pop("probeexecutable") + item["fields"].pop("attribute") + item["fields"].pop("dependancy") + item["fields"].pop("flags") + item["fields"].pop("files") + item["fields"].pop("parameter") + item["fields"].pop("fileparameter") + + changed = True + + if changed: + extension = file.split(".")[-1] + new_name = file.split(f".{extension}")[0] + f"-new.{extension}" + + with open(new_name, "w") as f: + json.dump(new_data, f, indent=4) + + +main() diff --git a/poem/Poem/Makefile b/poem/Poem/Makefile index 6ad496d52..58894d71d 100644 --- a/poem/Poem/Makefile +++ b/poem/Poem/Makefile @@ -9,11 
+9,20 @@ devel-watch: rm -rf frontend/bundles/reactbundle/* ; \ node_modules/.bin/webpack --config webpack.config.js --progress --mode development --watch -devel: +devel-django-server: + docker exec -ti $(container_name) ./run-django-server.sh + +devel-webpack-server: + node_modules/.bin/webpack-dev-server --config webpack.devserver.config.js --progress + +devel-webpack-server-hmr: + node_modules/.bin/webpack-dev-server --config webpack.devserver.config.js --progress --hot + +devel-bundle: rm -rf frontend/bundles/reactbundle/* ; \ node_modules/.bin/webpack --config webpack.config.js --progress --mode development -prod: +prod-bundle: rm -rf frontend/bundles/reactbundle/* ; \ node_modules/.bin/webpack --config webpack.config.js --progress --mode production diff --git a/poem/Poem/api/internal_views/app.py b/poem/Poem/api/internal_views/app.py index 8b843c3dd..49bd387a6 100644 --- a/poem/Poem/api/internal_views/app.py +++ b/poem/Poem/api/internal_views/app.py @@ -107,6 +107,7 @@ def get(self, request): options.update(webapiaggregation=settings.WEBAPI_AGGREGATION) options.update(webapithresholds=settings.WEBAPI_THRESHOLDS) options.update(webapioperations=settings.WEBAPI_OPERATIONS) + options.update(webapiservicetypes=settings.WEBAPI_SERVICETYPES) options.update(version=version) options.update(webapireports=dict( main=settings.WEBAPI_REPORTS, diff --git a/poem/Poem/api/internal_views/metricprofiles.py b/poem/Poem/api/internal_views/metricprofiles.py index 76d1ffbc8..857fd5a0c 100644 --- a/poem/Poem/api/internal_views/metricprofiles.py +++ b/poem/Poem/api/internal_views/metricprofiles.py @@ -13,22 +13,6 @@ from .utils import error_response -class ListAllServiceFlavours(APIView): - authentication_classes = (SessionAuthentication,) - - def get(self, request): - service_flavours = poem_models.ServiceFlavour.objects.all() - serializer = serializers.ServiceFlavourSerializer( - service_flavours, many=True - ) - data = [] - for item in serializer.data: - data.append( - {'name': item['name'], 'description': item['description']} - ) - return Response(sorted(data, key=lambda k: k['name'].lower())) - - class ListMetricProfiles(APIView): authentication_classes = (SessionAuthentication,) diff --git a/poem/Poem/api/internal_views/metrics.py b/poem/Poem/api/internal_views/metrics.py index d30a7a6fe..37b7b6781 100644 --- a/poem/Poem/api/internal_views/metrics.py +++ b/poem/Poem/api/internal_views/metrics.py @@ -53,43 +53,49 @@ def get(self, request, name=None): results = [] for metric in metrics: - config = two_value_inline(metric.config) - parent = one_value_inline(metric.parent) - probeexecutable = one_value_inline(metric.probeexecutable) - attribute = two_value_inline(metric.attribute) - dependancy = two_value_inline(metric.dependancy) - flags = two_value_inline(metric.flags) - files = two_value_inline(metric.files) - parameter = two_value_inline(metric.parameter) - fileparameter = two_value_inline(metric.fileparameter) - - if metric.probekey: - probeversion = metric.probekey.__str__() + if metric.probeversion: + metric_probe = metric.probeversion.split(" (") + probe_name = metric_probe[0] + probe_version = metric_probe[1][:-1] + + mt = admin_models.MetricTemplateHistory.objects.get( + name=metric.name, probekey__name=probe_name, + probekey__package__version=probe_version + ) + else: - probeversion = '' + mt = admin_models.MetricTemplateHistory.objects.get( + name=metric.name + ) + + config = two_value_inline(metric.config) + parent = one_value_inline(mt.parent) + probeexecutable = 
one_value_inline(mt.probeexecutable) + attribute = two_value_inline(mt.attribute) + dependency = two_value_inline(mt.dependency) + flags = two_value_inline(mt.flags) + files = two_value_inline(mt.files) + parameter = two_value_inline(mt.parameter) + fileparameter = two_value_inline(mt.fileparameter) if metric.group: group = metric.group.name else: group = '' - mt = admin_models.MetricTemplateHistory.objects.get( - name=metric.name, probekey=metric.probekey - ) - results.append(dict( id=metric.id, name=metric.name, - mtype=metric.mtype.name, + mtype=mt.mtype.name, tags=[tag.name for tag in mt.tags.all()], - probeversion=probeversion, + probeversion=metric.probeversion if metric.probeversion else "", group=group, - description=metric.description, + description=mt.description, parent=parent, probeexecutable=probeexecutable, config=config, attribute=attribute, - dependancy=dependancy, + dependancy=dependency, flags=flags, files=files, parameter=parameter, @@ -126,76 +132,44 @@ def put(self, request): ) else: - if request.data['parent']: - parent = json.dumps([request.data['parent']]) - else: - parent = '' - - if request.data['probeexecutable']: - probeexecutable = json.dumps( - [request.data['probeexecutable']] - ) - else: - probeexecutable = '' - - if request.data['description']: - description = request.data['description'] - else: - description = '' - metric_group = poem_models.GroupOfMetrics.objects.get( name=request.data['group'] ) - metric_type = poem_models.MetricType.objects.get( - name=request.data['mtype'] - ) - user_perm = request.user.is_superuser or \ metric_group in userprofile.groupsofmetrics.all() if user_perm: - metric.name = request.data['name'] - metric.mtype = metric_type metric.group = poem_models.GroupOfMetrics.objects.get( name=request.data['group'] ) - metric.description = description - metric.parent = parent - metric.flags = inline_metric_for_db(request.data['flags']) if request.data['mtype'] == 'Active': - metric.probekey = admin_models.ProbeHistory.objects.get( - name=request.data['probeversion'].split(' ')[0], - package__version=request.data[ - 'probeversion' - ].split(' ')[1][1:-1] - ) - metric.probeexecutable = probeexecutable metric.config = inline_metric_for_db( request.data['config'] ) - metric.attribute = inline_metric_for_db( - request.data['attribute']) - metric.dependancy = inline_metric_for_db( - request.data['dependancy']) - metric.files = inline_metric_for_db( - request.data['files'] + metric.probeversion = request.data["probeversion"] + + metric_probe = request.data["probeversion"].split(" (") + probe_name = metric_probe[0].strip() + probe_version = metric_probe[1][:-1].strip() + + mt = admin_models.MetricTemplateHistory.objects.get( + name=metric.name, probekey__name=probe_name, + probekey__package__version=probe_version ) - metric.parameter = inline_metric_for_db( - request.data['parameter']) - metric.fileparameter = inline_metric_for_db( - request.data['fileparameter'] + + else: + mt = admin_models.MetricTemplateHistory.objects.get( + name=metric.name ) - metric.save() - mt = admin_models.MetricTemplateHistory.objects.get( - name=metric.name, probekey=metric.probekey - ) create_history( metric, request.user.username, tags=mt.tags.all() ) + metric.save() + return Response(status=status.HTTP_201_CREATED) else: @@ -223,12 +197,6 @@ def put(self, request): detail='Group of metrics does not exist.' ) - except poem_models.MetricType.DoesNotExist: - return error_response( - status_code=status.HTTP_404_NOT_FOUND, - detail='Metric type does not exist.' 
- ) - except KeyError as e: return error_response( status_code=status.HTTP_400_BAD_REQUEST, @@ -300,21 +268,6 @@ def delete(self, request, name): return self._denied() -class ListMetricTypes(APIView): - authentication_classes = (SessionAuthentication,) - - def get(self, request): - types = poem_models.MetricType.objects.all().values_list( - 'name', flat=True - ) - return Response(types) - - -class ListPublicMetricTypes(ListMetricTypes): - authentication_classes = () - permission_classes = () - - class ImportMetrics(APIView): authentication_classes = (SessionAuthentication,) @@ -439,63 +392,72 @@ def _handle_metrics( updated = [] profile_warning = [] for metric in poem_models.Metric.objects.all(): - if metric.probekey and \ - metric.probekey.package.name == package.name: - mts_history = \ - admin_models.MetricTemplateHistory.objects.filter( - name=metric.name - ) - if len(mts_history) > 0: - mts = admin_models.MetricTemplateHistory.objects.filter( - object_id=mts_history[0].object_id - ) - metrictemplate = None - for mt in mts: - if mt.probekey.package == package: - metrictemplate = mt - break - - if metrictemplate: - if not dry_run: - update_metric_in_schema( - mt_id=metrictemplate.id, - name=metric.name, - pk_id=metric.probekey.id, - schema=schema, - update_from_history=True, - user=user - ) - updated.append(metric.name) + if metric.probeversion: + metric_probe = metric.probeversion.split(" (") + probe_name = metric_probe[0].strip() + probe_version = metric_probe[1][:-1].strip() + probekey = admin_models.ProbeHistory.objects.filter( + name=probe_name, + package__version=probe_version + ) + if len(probekey) == 1 and \ + probekey[0].package.name == package.name: + mts_history = \ + admin_models.MetricTemplateHistory.objects.filter( + name=metric.name, + probekey__package__name=package.name + ) + if len(mts_history) > 0: + mts = admin_models.MetricTemplateHistory.objects.filter( + object_id=mts_history[0].object_id + ) + metrictemplate = None + for mt in mts: + if mt.probekey.package == package: + metrictemplate = mt + break + + if metrictemplate: + if not dry_run: + update_metric_in_schema( + mt_id=metrictemplate.id, + name=metric.name, + pk_id=probekey[0].id, + schema=schema, + update_from_history=True, + user=user + ) + updated.append(metric.name) - else: - if dry_run: - for key, value in metrics.items(): - if metric.name == key: - if len(value) == 1: - profile_warning.append( - 'Metric {} is part of {} ' - 'metric profile.'.format( - metric.name, value[0] + else: + if dry_run: + for key, value in metrics.items(): + if metric.name == key: + if len(value) == 1: + profile_warning.append( + 'Metric {} is part of {} ' + 'metric profile.'.format( + metric.name, value[0] + ) ) - ) - - else: - profile_warning.append( - 'Metric {} is part of {} ' - 'metric profiles.'.format( - metric.name, ', '.join( - value + + else: + profile_warning.append( + 'Metric {} is part of {} ' + 'metric profiles.'.format( + metric.name, ', '.join( + value + ) ) ) - ) - else: - metric.delete() + else: + metric.delete() - deleted_not_in_package.append(metric.name) + deleted_not_in_package.append(metric.name) - else: - warning_no_tbh.append(metric.name) + else: + warning_no_tbh.append(metric.name) msg = dict() if deleted_not_in_package: diff --git a/poem/Poem/api/internal_views/metrictemplates.py b/poem/Poem/api/internal_views/metrictemplates.py index 5e1e55b19..c28495e5b 100644 --- a/poem/Poem/api/internal_views/metrictemplates.py +++ b/poem/Poem/api/internal_views/metrictemplates.py @@ -1063,3 +1063,66 @@ class 
ListPublicMetricTags(ListMetricTags): class ListPublicMetricTemplates4Tag(ListMetricTemplates4Tag): authentication_classes = () permission_classes = () + + +class ListDefaultPorts(APIView): + authentication_classes = (SessionAuthentication,) + + def get(self, request): + ports = admin_models.DefaultPort.objects.all().order_by("name") + serializer = serializers.DefaultPortsSerializer(ports, many=True) + return Response(serializer.data) + + def post(self, request): + db_ports = admin_models.DefaultPort.objects.all().values_list( + "name", flat=True + ) + if request.tenant.schema_name == get_public_schema_name() and \ + request.user.is_superuser: + try: + if isinstance(request.data["ports"], str): + ports = json.loads(request.data["ports"]) + + else: + ports = request.data["ports"] + + for port in ports: + assert port["name"] + assert port["value"] + + except KeyError as e: + return error_response( + status_code=status.HTTP_400_BAD_REQUEST, + detail=f"Wrong JSON format: Missing key '{e.args[0]}'" + ) + + else: + for port in ports: + if port["name"] in db_ports: + stored_port = admin_models.DefaultPort.objects.get( + name=port["name"] + ) + if stored_port.value != port["value"]: + stored_port.value = port["value"] + stored_port.save() + + else: + admin_models.DefaultPort.objects.create( + name=port["name"], value=port["value"] + ) + + for name in set(db_ports).difference( + set([p["name"] for p in ports]) + ): + stored_port = admin_models.DefaultPort.objects.get( + name=name + ) + stored_port.delete() + + return Response(status=status.HTTP_201_CREATED) + + else: + return error_response( + status_code=status.HTTP_401_UNAUTHORIZED, + detail="You do not have permission to add ports" + ) diff --git a/poem/Poem/api/internal_views/package.py b/poem/Poem/api/internal_views/package.py index 2c71f5cc8..92f59dea8 100644 --- a/poem/Poem/api/internal_views/package.py +++ b/poem/Poem/api/internal_views/package.py @@ -5,11 +5,11 @@ from Poem.poem_super_admin import models as admin_models from django.db import IntegrityError, connection from django.db.models import ProtectedError +from django_tenants.utils import get_public_schema_name from rest_framework import status from rest_framework.authentication import SessionAuthentication from rest_framework.response import Response from rest_framework.views import APIView -from django_tenants.utils import get_public_schema_name from .utils import error_response @@ -59,7 +59,7 @@ def get(self, request, nameversion=None): 'name': package.name, 'version': package.version, 'use_present_version': package.use_present_version, - 'repos': repos + 'repos': sorted(repos) } return Response(result) @@ -71,8 +71,14 @@ def get(self, request, nameversion=None): if connection.schema_name != get_public_schema_name(): packages = set() for metric in poem_models.Metric.objects.all(): - if metric.probekey: - packages.add(metric.probekey.package) + if metric.probeversion: + metric_probe = metric.probeversion.split("(") + probe_name = metric_probe[0].strip() + probe_version = metric_probe[1][:-1].strip() + probe = admin_models.ProbeHistory.objects.get( + name=probe_name, package__version=probe_version + ) + packages.add(probe.package) else: packages = admin_models.Package.objects.all() diff --git a/poem/Poem/api/internal_views/probes.py b/poem/Poem/api/internal_views/probes.py index cef4f4376..5da04db03 100644 --- a/poem/Poem/api/internal_views/probes.py +++ b/poem/Poem/api/internal_views/probes.py @@ -1,7 +1,4 @@ import datetime - -from django.db import IntegrityError - import json 
from Poem.api.views import NotFound @@ -9,13 +6,13 @@ from Poem.poem import models as poem_models from Poem.poem_super_admin import models as admin_models from Poem.tenants.models import Tenant - +from django.db import IntegrityError +from django_tenants.utils import schema_context, get_public_schema_name from rest_framework import status from rest_framework.authentication import SessionAuthentication from rest_framework.response import Response from rest_framework.views import APIView -from django_tenants.utils import schema_context, get_public_schema_name from .utils import error_response @@ -132,7 +129,6 @@ def put(self, request): history = admin_models.ProbeHistory.objects.filter( name=old_name, package__version=old_version ) - probekey = history[0] new_data = { 'name': request.data['name'], 'package': package, @@ -156,15 +152,20 @@ def put(self, request): }) history.update(**new_data) - # update Metric history in case probekey name has changed: + # update Metric history in case probe name has changed: if request.data['name'] != old_name: for schema in schemas: with schema_context(schema): metrics = poem_models.Metric.objects.filter( - probekey=probekey + probeversion=f"{old_name} ({old_version})" ) for metric in metrics: + metric.probeversion = \ + f"{request.data['name']} " \ + f"({old_version})" + metric.save() + vers = \ poem_models.TenantHistory.objects.filter( object_id=metric.id @@ -364,4 +365,3 @@ def post(self, request): def delete(self, request, name): return self._denied() - diff --git a/poem/Poem/api/internal_views/servicetypes.py b/poem/Poem/api/internal_views/servicetypes.py deleted file mode 100644 index 458e9d6ae..000000000 --- a/poem/Poem/api/internal_views/servicetypes.py +++ /dev/null @@ -1,21 +0,0 @@ -from Poem.poem.models import ServiceFlavour -from Poem.api import serializers - -from rest_framework import status -from rest_framework.authentication import SessionAuthentication -from rest_framework.response import Response -from rest_framework.views import APIView - - -class ListServiceTypesDescriptions(APIView): - authentication_classes = (SessionAuthentication,) - - def get(self, request, name=None): - serviceflavours = ServiceFlavour.objects.all().order_by('name') - serializer = serializers.ServiceFlavourSerializer(serviceflavours, many=True) - return Response(serializer.data) - - -class ListPublicServiceTypesDescriptions(ListServiceTypesDescriptions): - authentication_classes = () - permission_classes = () diff --git a/poem/Poem/api/internal_views/utils.py b/poem/Poem/api/internal_views/utils.py index 439f81d0a..8a0a7c4b6 100644 --- a/poem/Poem/api/internal_views/utils.py +++ b/poem/Poem/api/internal_views/utils.py @@ -211,17 +211,15 @@ def get_tenant_resources(schema_name): with schema_context(schema_name): if schema_name == get_public_schema_name(): metrics = admin_models.MetricTemplate.objects.all() + probes = [metric.probekey for metric in metrics if metric.probekey] met_key = 'metric_templates' else: metrics = poem_models.Metric.objects.all() + probes = [ + metric.probeversion for metric in metrics if metric.probeversion + ] met_key = 'metrics' n_met = metrics.count() - - probes = set() - for metric in metrics: - if metric.probekey: - probes.add(metric.probekey) - - n_probe = len(probes) + n_probe = len(set(probes)) return {met_key: n_met, 'probes': n_probe} diff --git a/poem/Poem/api/migrations/0002_auto_20221027_0718.py b/poem/Poem/api/migrations/0002_auto_20221027_0718.py new file mode 100644 index 000000000..48c92fa6b --- /dev/null +++ 
b/poem/Poem/api/migrations/0002_auto_20221027_0718.py @@ -0,0 +1,23 @@ +# Generated by Django 3.2.16 on 2022-10-27 07:18 + +from django.db import migrations, models + + +class Migration(migrations.Migration): + + dependencies = [ + ('api', '0001_initial'), + ] + + operations = [ + migrations.AlterField( + model_name='myapikey', + name='hashed_key', + field=models.CharField(editable=False, max_length=150), + ), + migrations.AlterField( + model_name='myapikey', + name='id', + field=models.CharField(editable=False, max_length=150, primary_key=True, serialize=False, unique=True), + ), + ] diff --git a/poem/Poem/api/serializers.py b/poem/Poem/api/serializers.py index 1f8b20245..9408b5b2a 100644 --- a/poem/Poem/api/serializers.py +++ b/poem/Poem/api/serializers.py @@ -1,7 +1,7 @@ from rest_framework import serializers from Poem.poem import models -from Poem.poem_super_admin.models import Probe, MetricTags +from Poem.poem_super_admin.models import Probe, MetricTags, DefaultPort from Poem.users.models import CustUser @@ -23,12 +23,6 @@ class Meta: model = models.Reports -class ServiceFlavourSerializer(serializers.ModelSerializer): - class Meta: - fields = ('name', 'description', ) - model = models.ServiceFlavour - - class UsersSerializer(serializers.ModelSerializer): class Meta: fields = ('first_name', 'last_name', 'username', 'is_active', @@ -64,3 +58,9 @@ class MetricTagsSerializer(serializers.ModelSerializer): class Meta: model = MetricTags fields = "__all__" + + +class DefaultPortsSerializer(serializers.ModelSerializer): + class Meta: + model = DefaultPort + fields = "__all__" diff --git a/poem/Poem/api/tests/test_app.py b/poem/Poem/api/tests/test_app.py index 37325663a..994d5b132 100644 --- a/poem/Poem/api/tests/test_app.py +++ b/poem/Poem/api/tests/test_app.py @@ -167,6 +167,7 @@ def setUp(self): self.view = views.GetConfigOptions.as_view() self.url = '/api/v2/internal/config_options/' self.user = CustUser.objects.create(username='testuser') + self.maxDiff = None @patch('Poem.api.internal_views.app.saml_login_string', return_value='Log in using B2ACCESS') @@ -182,6 +183,7 @@ def test_get_config_options(self, *args): WEBAPI_REPORTSTAGS='https://reports-tags.com', WEBAPI_REPORTSTOPOLOGYGROUPS='https://topology-groups.com', WEBAPI_REPORTSTOPOLOGYENDPOINTS='https://endpoints.com', + WEBAPI_SERVICETYPES='https://topology-servicetypes.com', LINKS_TERMS_PRIVACY={ 'tenant': { 'terms': 'https://terms.of.use.com', @@ -209,6 +211,7 @@ def test_get_config_options(self, *args): 'topologygroups': 'https://topology-groups.com', 'topologyendpoints': 'https://endpoints.com' }, + 'webapiservicetypes': 'https://topology-servicetypes.com', 'tenant_name': 'tenant', 'terms_privacy_links': { 'terms': 'https://terms.of.use.com', diff --git a/poem/Poem/api/tests/test_groupelements.py b/poem/Poem/api/tests/test_groupelements.py index 75fcf0505..d89acb1a4 100644 --- a/poem/Poem/api/tests/test_groupelements.py +++ b/poem/Poem/api/tests/test_groupelements.py @@ -1,12 +1,14 @@ import datetime import json +import factory from Poem.api import views_internal as views from Poem.helpers.history_helpers import serialize_metric from Poem.poem import models as poem_models from Poem.poem_super_admin import models as admin_models from Poem.users.models import CustUser from django.contrib.contenttypes.models import ContentType +from django.db.models.signals import pre_save from django_tenants.test.cases import TenantTestCase from django_tenants.test.client import TenantRequestFactory from rest_framework import status @@ -15,6 +17,7 
@@ from .utils_test import encode_data +@factory.django.mute_signals(pre_save) class ListMetricsInGroupAPIViewTests(TenantTestCase): def setUp(self): self.factory = TenantRequestFactory(self.tenant) @@ -969,7 +972,7 @@ def test_add_metric_profile_in_group_regular_user(self): def test_remove_metric_profile_from_group(self): self.group.metricprofiles.add(self.mp2) - self.mp2.groupname='EGI' + self.mp2.groupname = 'EGI' self.mp2.save() self.assertEqual(self.group.metricprofiles.all().count(), 2) data = { diff --git a/poem/Poem/api/tests/test_helpers.py b/poem/Poem/api/tests/test_helpers.py index 4befcae5b..57007204f 100644 --- a/poem/Poem/api/tests/test_helpers.py +++ b/poem/Poem/api/tests/test_helpers.py @@ -2,6 +2,7 @@ import json from unittest.mock import patch, call +import factory import requests from Poem.api.models import MyAPIKey from Poem.helpers.history_helpers import create_comment, update_comment, \ @@ -19,6 +20,7 @@ from django.core import serializers from django.core.management import call_command from django.db import connection +from django.db.models.signals import pre_save from django.test.testcases import TransactionTestCase from django_tenants.test.cases import TenantTestCase from django_tenants.utils import get_tenant_model, get_public_schema_name, \ @@ -52,20 +54,13 @@ def mock_db(tenant, tenant2=False): user = CustUser.objects.create_user(username='testuser') - mt_active = admin_models.MetricTemplateType.objects.create( - name='Active' - ) - mt_passive = admin_models.MetricTemplateType.objects.create( - name='Passive' - ) + active = admin_models.MetricTemplateType.objects.create(name='Active') + passive = admin_models.MetricTemplateType.objects.create(name='Passive') mtag1 = admin_models.MetricTags.objects.create(name='test_tag1') mtag2 = admin_models.MetricTags.objects.create(name='test_tag2') mtag3 = admin_models.MetricTags.objects.create(name='test_tag3') - m_active = poem_models.MetricType.objects.create(name='Active') - m_passive = poem_models.MetricType.objects.create(name='Passive') - tag1 = admin_models.OSTag.objects.create(name='CentOS 6') tag2 = admin_models.OSTag.objects.create(name='CentOS 7') @@ -384,7 +379,7 @@ def mock_db(tenant, tenant2=False): metrictemplate1 = admin_models.MetricTemplate.objects.create( name='argo.AMS-Check', - mtype=mt_active, + mtype=active, probekey=probeversion1_1, description='Some description of argo.AMS-Check metric template.', probeexecutable='["ams-probe"]', @@ -469,7 +464,7 @@ def mock_db(tenant, tenant2=False): metrictemplate2 = admin_models.MetricTemplate.objects.create( name='org.nagios.CertLifetime', - mtype=mt_active, + mtype=active, probekey=probeversion2, probeexecutable='check_http', config='["maxCheckAttempts 2", "timeout 60", ' @@ -502,7 +497,7 @@ def mock_db(tenant, tenant2=False): metrictemplate3 = admin_models.MetricTemplate.objects.create( name='eu.egi.cloud.OpenStack-VM', - mtype=mt_active, + mtype=active, probekey=probeversion3, probeexecutable='novaprobe.py', config='["maxCheckAttempts 2", "timeout 300", ' @@ -539,7 +534,7 @@ def mock_db(tenant, tenant2=False): metrictemplate4 = admin_models.MetricTemplate.objects.create( name='eu.egi.cloud.OpenStack-Swift', - mtype=mt_active, + mtype=active, probekey=probeversion4, probeexecutable='swiftprobe.py', config='["maxCheckAttempts 2", "timeout 300", ' @@ -577,7 +572,7 @@ def mock_db(tenant, tenant2=False): metrictemplate5 = admin_models.MetricTemplate.objects.create( name='org.apel.APEL-Pub', flags='["OBSESS 1", "PASSIVE 1"]', - mtype=mt_passive + mtype=passive ) 
admin_models.MetricTemplateHistory.objects.create( @@ -601,7 +596,7 @@ def mock_db(tenant, tenant2=False): metrictemplate6 = admin_models.MetricTemplate.objects.create( name='org.nagios.CertLifetime2', - mtype=mt_active, + mtype=active, probekey=probeversion2, probeexecutable='check_http', config='["maxCheckAttempts 4", "timeout 70", ' @@ -635,7 +630,7 @@ def mock_db(tenant, tenant2=False): metrictemplate7 = admin_models.MetricTemplate.objects.create( name='eu.egi.sec.ARC-CE-result', flags='["OBSESS 0", "PASSIVE 1", "VO 1"]', - mtype=mt_passive, + mtype=passive, parent='eu.egi.sec.ARC-CE-submit' ) @@ -660,7 +655,7 @@ def mock_db(tenant, tenant2=False): metrictemplate8 = admin_models.MetricTemplate.objects.create( name='argo.AMSPublisher-Check', - mtype=mt_active, + mtype=active, probekey=probeversion5_1, description='Some description of publisher metric.', probeexecutable='ams-publisher-probe', @@ -743,7 +738,7 @@ def mock_db(tenant, tenant2=False): metrictemplate9 = admin_models.MetricTemplate.objects.create( name='argo.AMS-Check-Old', - mtype=mt_active, + mtype=active, probekey=probeversion1_1, description='Some description of argo.AMS-Check metric template.', probeexecutable='["ams-probe"]', @@ -776,7 +771,7 @@ def mock_db(tenant, tenant2=False): metrictemplate10 = admin_models.MetricTemplate.objects.create( name='test.Metric-Template', - mtype=mt_active, + mtype=active, probekey=probeversion6_1, description='Metric template for testing probe.', probeexecutable='["test-probe"]', @@ -830,7 +825,7 @@ def mock_db(tenant, tenant2=False): metrictemplate11 = admin_models.MetricTemplate.objects.create( name='test.MetricTemplate', - mtype=mt_active, + mtype=active, probekey=probeversion7_2, description='Metric template for something.', probeexecutable='["test-probe-2"]', @@ -872,16 +867,10 @@ def mock_db(tenant, tenant2=False): metric1 = poem_models.Metric.objects.create( name='argo.AMS-Check', group=group, - mtype=m_active, - probekey=probeversion1_2, - description='Some description of argo.AMS-Check metric template.', - probeexecutable='["ams-probe"]', + probeversion=probeversion1_2.__str__(), config='["maxCheckAttempts 4", "timeout 60",' ' "path /usr/libexec/argo-monitoring/probes/argo",' - ' "interval 5", "retryInterval 2"]', - attribute='["argo.ams_TOKEN --token"]', - flags='["OBSESS 1"]', - parameter='["--project EGI"]' + ' "interval 5", "retryInterval 2"]' ) poem_models.TenantHistory.objects.create( @@ -896,17 +885,8 @@ def mock_db(tenant, tenant2=False): metric2 = poem_models.Metric.objects.create( name=metrictemplate2.name, group=group, - mtype=m_active, - description=metrictemplate2.description, - probekey=metrictemplate2.probekey, - probeexecutable=metrictemplate2.probeexecutable, - config=metrictemplate2.config, - attribute=metrictemplate2.attribute, - dependancy=metrictemplate2.dependency, - flags=metrictemplate2.flags, - files=metrictemplate2.files, - parameter=metrictemplate2.parameter, - fileparameter=metrictemplate2.fileparameter, + probeversion=metrictemplate2.probekey.__str__(), + config=metrictemplate2.config ) poem_models.TenantHistory.objects.create( @@ -922,19 +902,10 @@ def mock_db(tenant, tenant2=False): metric3 = poem_models.Metric.objects.create( name=metrictemplate4.name, group=group, - mtype=m_active, - description=metrictemplate4.description, - probekey=metrictemplate4.probekey, - probeexecutable=metrictemplate4.probeexecutable, + probeversion=metrictemplate4.probekey.__str__(), config='["maxCheckAttempts 3", "timeout 300", ' '"path 
/usr/libexec/argo-monitoring/probes/fedcloud", ' - '"interval 80", "retryInterval 15"]', - attribute=metrictemplate4.attribute, - dependancy=metrictemplate4.dependency, - flags=metrictemplate4.flags, - files=metrictemplate4.files, - parameter=metrictemplate4.parameter, - fileparameter=metrictemplate4.fileparameter + '"interval 80", "retryInterval 15"]' ) poem_models.TenantHistory.objects.create( @@ -950,17 +921,7 @@ def mock_db(tenant, tenant2=False): metric4 = poem_models.Metric.objects.create( name=metrictemplate5.name, group=group, - mtype=m_passive, - description=metrictemplate5.description, - probekey=metrictemplate5.probekey, - probeexecutable=metrictemplate5.probeexecutable, - config=metrictemplate5.config, - attribute=metrictemplate5.attribute, - dependancy=metrictemplate5.dependency, - flags=metrictemplate5.flags, - files=metrictemplate5.files, - parameter=metrictemplate5.parameter, - fileparameter=metrictemplate5.fileparameter, + config=metrictemplate5.config ) poem_models.TenantHistory.objects.create( @@ -976,17 +937,8 @@ def mock_db(tenant, tenant2=False): metric5 = poem_models.Metric.objects.create( name=metrictemplate10.name, group=group, - mtype=m_active, - description=metrictemplate10.description, - probekey=metrictemplate10.probekey, - probeexecutable=metrictemplate10.probeexecutable, - config=metrictemplate10.config, - attribute=metrictemplate10.attribute, - dependancy=metrictemplate10.dependency, - flags=metrictemplate10.flags, - files=metrictemplate10.files, - parameter=metrictemplate10.parameter, - fileparameter=metrictemplate10.fileparameter, + probeversion=metrictemplate10.probekey.__str__(), + config=metrictemplate10.config ) poem_models.TenantHistory.objects.create( @@ -1004,23 +956,14 @@ def mock_db(tenant, tenant2=False): group1 = poem_models.GroupOfMetrics.objects.create( name=Tenant.objects.get(schema_name='test2').name.upper() ) - m_active1 = poem_models.MetricType.objects.create(name='Active') - m_passive1 = poem_models.MetricType.objects.create(name='Passive') metric1a = poem_models.Metric.objects.create( name='argo.AMS-Check', group=group1, - mtype=m_active1, - probekey=probeversion1_2, - description='Some description of argo.AMS-Check metric ' - 'template.', - probeexecutable='["ams-probe"]', + probeversion=probeversion1_2.__str__(), config='["maxCheckAttempts 4", "timeout 60",' ' "path /usr/libexec/argo-monitoring/probes/argo",' - ' "interval 5", "retryInterval 2"]', - attribute='["argo.ams_TOKEN --token"]', - flags='["OBSESS 1"]', - parameter='["--project EGI"]' + ' "interval 5", "retryInterval 2"]' ) poem_models.TenantHistory.objects.create( @@ -1037,17 +980,8 @@ def mock_db(tenant, tenant2=False): metric2a = poem_models.Metric.objects.create( name=metrictemplate2.name, group=group1, - mtype=m_active1, - description=metrictemplate2.description, - probekey=metrictemplate2.probekey, - probeexecutable=metrictemplate2.probeexecutable, - config=metrictemplate2.config, - attribute=metrictemplate2.attribute, - dependancy=metrictemplate2.dependency, - flags=metrictemplate2.flags, - files=metrictemplate2.files, - parameter=metrictemplate2.parameter, - fileparameter=metrictemplate2.fileparameter, + probeversion=metrictemplate2.probekey.__str__(), + config=metrictemplate2.config ) poem_models.TenantHistory.objects.create( @@ -1063,19 +997,10 @@ def mock_db(tenant, tenant2=False): metric3a = poem_models.Metric.objects.create( name=metrictemplate4.name, group=group1, - mtype=m_active1, - description=metrictemplate4.description, - probekey=metrictemplate4.probekey, - 
probeexecutable=metrictemplate4.probeexecutable, + probeversion=metrictemplate4.probekey.__str__(), config='["maxCheckAttempts 3", "timeout 300", ' '"path /usr/libexec/argo-monitoring/probes/fedcloud", ' - '"interval 80", "retryInterval 15"]', - attribute=metrictemplate4.attribute, - dependancy=metrictemplate4.dependency, - flags=metrictemplate4.flags, - files=metrictemplate4.files, - parameter=metrictemplate4.parameter, - fileparameter=metrictemplate4.fileparameter + '"interval 80", "retryInterval 15"]' ) poem_models.TenantHistory.objects.create( @@ -1091,17 +1016,7 @@ def mock_db(tenant, tenant2=False): metric4a = poem_models.Metric.objects.create( name=metrictemplate5.name, group=group1, - mtype=m_passive1, - description=metrictemplate5.description, - probekey=metrictemplate5.probekey, - probeexecutable=metrictemplate5.probeexecutable, - config=metrictemplate5.config, - attribute=metrictemplate5.attribute, - dependancy=metrictemplate5.dependency, - flags=metrictemplate5.flags, - files=metrictemplate5.files, - parameter=metrictemplate5.parameter, - fileparameter=metrictemplate5.fileparameter, + config=metrictemplate5.config ) poem_models.TenantHistory.objects.create( @@ -1117,17 +1032,8 @@ def mock_db(tenant, tenant2=False): metric5a = poem_models.Metric.objects.create( name=metrictemplate10.name, group=group1, - mtype=m_active1, - description=metrictemplate10.description, - probekey=metrictemplate10.probekey, - probeexecutable=metrictemplate10.probeexecutable, - config=metrictemplate10.config, - attribute=metrictemplate10.attribute, - dependancy=metrictemplate10.dependency, - flags=metrictemplate10.flags, - files=metrictemplate10.files, - parameter=metrictemplate10.parameter, - fileparameter=metrictemplate10.fileparameter, + probeversion=metrictemplate10.probekey.__str__(), + config=metrictemplate10.config ) poem_models.TenantHistory.objects.create( @@ -1141,6 +1047,7 @@ def mock_db(tenant, tenant2=False): ) +@factory.django.mute_signals(pre_save) class HistoryHelpersTests(TenantTestCase): def setUp(self): tag = admin_models.OSTag.objects.create(name='CentOS 6') @@ -1188,9 +1095,6 @@ def setUp(self): self.active = admin_models.MetricTemplateType.objects.create( name='Active' ) - self.metric_active = poem_models.MetricType.objects.create( - name='Active' - ) probe_history1 = admin_models.ProbeHistory.objects.create( object_id=self.probe1, @@ -1352,22 +1256,89 @@ def setUp(self): ) mth3.tags.add(self.metrictag1) - self.metric1 = poem_models.Metric.objects.create( + self.mt4 = admin_models.MetricTemplate.objects.create( name='metric-1', description='Description of metric-1.', parent='["parent"]', config='["maxCheckAttempts 3", "timeout 60",' ' "path $USER", "interval 5", "retryInterval 3"]', attribute='["attribute-key attribute-value"]', - dependancy='["dependency-key1 dependency-value1", ' + dependency='["dependency-key1 dependency-value1", ' '"dependency-key2 dependency-value2"]', flags='["flags-key flags-value"]', files='["files-key files-value"]', parameter='["parameter-key parameter-value"]', fileparameter='["fileparameter-key fileparameter-value"]', - mtype=self.metric_active, + mtype=self.active, probekey=probe_history1 ) + self.mt4.tags.add(self.metrictag1, self.metrictag2) + + self.mth4 = admin_models.MetricTemplateHistory.objects.create( + object_id=self.mt4, + name=self.mt4.name, + mtype=self.mt4.mtype, + probekey=self.mt4.probekey, + description=self.mt4.description, + parent=self.mt4.parent, + probeexecutable=self.mt4.probeexecutable, + config=self.mt4.config, + 
attribute=self.mt4.attribute, + dependency=self.mt4.dependency, + flags=self.mt4.flags, + files=self.mt4.files, + parameter=self.mt4.parameter, + fileparameter=self.mt4.fileparameter, + date_created=datetime.datetime.now(), + version_comment='Initial version.', + version_user='testuser' + ) + self.mth4.tags.add(self.metrictag1, self.metrictag2) + + self.metric1 = poem_models.Metric.objects.create( + name='metric-1', + config='["maxCheckAttempts 3", "timeout 60",' + ' "path $USER", "interval 5", "retryInterval 3"]', + probeversion=probe_history1.__str__() + ) + + mt5 = admin_models.MetricTemplate.objects.create( + name='metric-2', + description='Description of metric-2.', + parent='', + config='["maxCheckAttempts 3", "timeout 60",' + ' "path $USER", "interval 5", "retryInterval 3"]', + attribute='["attribute-key attribute-value"]', + dependency='["dependency-key1 dependency-value1"]', + flags='["flags-key flags-value", "flags-key1 flags-value1"]', + files='["files-key files-value"]', + parameter='["parameter-key parameter-value"]', + fileparameter='["fileparameter-key fileparameter-value"]', + mtype=self.active, + probekey=self.probe_history2 + ) + mt5.tags.add(self.metrictag1, self.metrictag2) + + self.mth5 = admin_models.MetricTemplateHistory.objects.create( + object_id=mt5, + name=mt5.name, + mtype=mt5.mtype, + probekey=mt5.probekey, + description=mt5.description, + parent=mt5.parent, + probeexecutable=mt5.probeexecutable, + config=mt5.config, + attribute=mt5.attribute, + dependency=mt5.dependency, + flags=mt5.flags, + files=mt5.files, + parameter=mt5.parameter, + fileparameter=mt5.fileparameter, + date_created=datetime.datetime.now(), + version_comment='Newer version.', + version_user='testuser' + ) + self.mth5.tags.add(self.metrictag1, self.metrictag2) poem_models.TenantHistory.objects.create( object_id=self.metric1.id, @@ -1684,31 +1655,18 @@ def test_create_comment_for_probe_if_initial(self): self.assertEqual(comment, 'Initial version.') def test_create_comment_for_metric(self): - metric = poem_models.Metric.objects.get( - id=self.metric1.id - ) - metric.name = 'metric-2' - metric.description = 'Description for metric-2.' 
- metric.probekey = self.probe_history2 - metric.probeexecutable = '["new-probeexecutable"]' + metric = poem_models.Metric.objects.get(id=self.metric1.id) + metric.name = "metric-2" + metric.probeversion = self.probe_history2.__str__() metric.config = '["maxCheckAttempts 3", "timeout 60",' \ ' "path $USER", "interval 5", "retryInterval 3"]' - metric.attribute = '["attribute-key attribute-value"]' - metric.dependancy = '["dependency-key1 dependency-value1"]' - metric.flags = '["flags-key flags-value", "flags-key1 flags-value2"]' - metric.files = '["files-key files-value"]' - metric.parameter = '["parameter-key parameter-value"]' - metric.fileparameter = '["fileparameter-key fileparameter-value"]' - metric.mtype = self.metric_active - metric.parent = '' metric.save() - serialized_data = serializers.serialize( - 'json', [metric], - use_natural_foreign_keys=True, - use_natural_primary_keys=True + serialized_data = serialize_metric( + metric, tags=[self.metrictag1, self.metrictag2] + ) + comment = create_comment( + self.metric1, self.ct_metric, serialized_data ) - comment = create_comment(self.metric1, self.ct_metric, - serialized_data) comment_set = set() for item in json.loads(comment): comment_set.add(json.dumps(item)) @@ -1718,34 +1676,30 @@ def test_create_comment_for_metric(self): '{"deleted": {"fields": ["dependancy"], ' '"object": ["dependency-key2"]}}', '{"added": {"fields": ["flags"], "object": ["flags-key1"]}}', - '{"added": {"fields": ["probeexecutable"]}}', '{"changed": {"fields": ["description", "name", "probekey"]}}', '{"deleted": {"fields": ["parent"]}}' } ) def test_create_comment_for_metric_if_field_deleted_from_model(self): - mt = poem_models.Metric.objects.get( - id=self.metric1.id - ) + mt = admin_models.MetricTemplateHistory.objects.get(id=self.mth5.id) mt.name = 'metric-2' mt.description = 'Description of metric-1.' - mt.probekey = self.probe_history2 mt.probeexecutable = '["new-probeexecutable"]' - mt.config = '["maxCheckAttempts 4", "timeout 60",' \ - ' "path $USER", "interval 5", "retryInterval 3"]' mt.attribute = '["attribute-key attribute-value"]' mt.dependancy = '["dependency-key1 dependency-value1"]' mt.flags = '["flags-key flags-value", "flags-key1 flags-value2"]' mt.files = '["files-key files-value"]' mt.parameter = '["parameter-key parameter-value"]' - mt.mtype = self.metric_active mt.parent = '' mt.save() - serialized_data = serializers.serialize( - 'json', [mt], - use_natural_foreign_keys=True, - use_natural_primary_keys=True + metric = poem_models.Metric.objects.get(id=self.metric1.id) + metric.name = "metric-2" + metric.probeversion = self.probe_history2.__str__() + metric.config = '["maxCheckAttempts 4", "timeout 60",' \ + ' "path $USER", "interval 5", "retryInterval 3"]' + serialized_data = serialize_metric( + metric, tags=[self.metrictag1, self.metrictag2] ) # let's say fileparameter field no longer exists in the model dict_serialized = json.loads(serialized_data) @@ -1771,28 +1725,26 @@ def test_create_comment_for_metric_if_field_deleted_from_model(self): ) def test_create_comment_for_metric_if_field_added_to_model(self): - mt = poem_models.Metric.objects.get( - id=self.metric1.id - ) + mt = admin_models.MetricTemplateHistory.objects.get(id=self.mth5.id) mt.name = 'metric-2' mt.description = 'Description of metric-1.' 
- mt.probekey = self.probe_history2 mt.probeexecutable = '["new-probeexecutable"]' - mt.config = '["maxCheckAttempts 4", "timeout 60",' \ - ' "path $USER", "interval 5", "retryInterval 4"]' mt.attribute = '["attribute-key attribute-value"]' mt.dependancy = '["dependency-key1 dependency-value1"]' mt.flags = '["flags-key flags-value", "flags-key1 flags-value2"]' mt.files = '["files-key files-value"]' mt.parameter = '["parameter-key parameter-value"]' mt.fileparameter = '["fileparameter-key fileparameter-value"]' - mt.mtype = self.metric_active mt.parent = '' mt.save() - serialized_data = serializers.serialize( - 'json', [mt], - use_natural_foreign_keys=True, - use_natural_primary_keys=True + metric = poem_models.Metric.objects.get(id=self.metric1.id) + metric.name = "metric-2" + metric.probeversion = self.probe_history2.__str__() + metric.config = '["maxCheckAttempts 4", "timeout 60",' \ + ' "path $USER", "interval 5", "retryInterval 4"]' + metric.save() + serialized_data = serialize_metric( + metric, tags=[self.metrictag1, self.metrictag2] ) # let's say mock_field was added to model dict_serialized = json.loads(serialized_data) @@ -1819,25 +1771,12 @@ def test_create_comment_for_metric_if_field_added_to_model(self): def test_create_comment_for_metric_if_initial(self): mt = poem_models.Metric.objects.create( - name='metric-template-2', - description='Description for metric-2.', - probekey=self.probe_history2, - probeexecutable='["new-probeexecutable"]', + name='metric-template-3', + probeversion=self.mt2.probekey.__str__(), config='["maxCheckAttempts 4", "timeout 60",' - ' "path $USER", "interval 5", "retryInterval 4"]', - attribute='["attribute-key attribute-value"]', - dependancy='["dependency-key1 dependency-value1"]', - flags='["flags-key flags-value", "flags-key1 flags-value2"]', - files='["files-key files-value"]', - parameter='["parameter-key parameter-value"]', - fileparameter='["fileparameter-key fileparameter-value"]', - mtype=self.metric_active - ) - serialized_data = serializers.serialize( - 'json', [mt], - use_natural_foreign_keys=True, - use_natural_primary_keys=True + ' "path $USER", "interval 5", "retryInterval 4"]' ) + serialized_data = serialize_metric(mt, tags=[self.metrictag1]) comment = create_comment(mt, self.ct_metric, serialized_data) self.assertEqual(comment, 'Initial version.') @@ -1846,10 +1785,8 @@ def test_create_comment_for_metric_if_group_was_none(self): m = self.metric1 m.group = group m.save() - serialized_data = serializers.serialize( - 'json', [m], - use_natural_foreign_keys=True, - use_natural_primary_keys=True + serialized_data = serialize_metric( + m, tags=[self.metrictag1, self.metrictag2] ) comment = create_comment(m, self.ct_metric, serialized_data) self.assertEqual(comment, '[{"added": {"fields": ["group"]}}]') @@ -2119,12 +2056,10 @@ def setUp(self): self.user = CustUser.objects.get(username='testuser') - self.m_active = poem_models.MetricType.objects.get(name='Active') - self.m_passive = poem_models.MetricType.objects.get(name='Passive') - self.mt_active = admin_models.MetricTemplateType.objects.get( + self.active = admin_models.MetricTemplateType.objects.get( name='Active' ) - self.mt_passive = admin_models.MetricTemplateType.objects.get( + self.passive = admin_models.MetricTemplateType.objects.get( name='Passive' ) @@ -2140,6 +2075,9 @@ def setUp(self): self.metrictemplate1 = admin_models.MetricTemplate.objects.get( name='argo.AMS-Check' ) + self.metrictemplate2 = admin_models.MetricTemplate.objects.get( + name="org.nagios.CertLifetime" + ) 
self.metrictemplate3 = admin_models.MetricTemplate.objects.get( name='eu.egi.cloud.OpenStack-VM' ) @@ -2152,21 +2090,26 @@ def setUp(self): self.metrictemplate7 = admin_models.MetricTemplate.objects.get( name='eu.egi.sec.ARC-CE-result' ) - metrictemplate8 = admin_models.MetricTemplate.objects.get( name='argo.AMSPublisher-Check' ) - - self.mt8_history2 = admin_models.MetricTemplateHistory.objects.get( - object_id=metrictemplate8, version_comment='Newer version.' + self.metrictemplate10 = admin_models.MetricTemplate.objects.get( + name="test.Metric-Template" ) probeversion1 = admin_models.ProbeHistory.objects.filter( name='ams-probe' - ) + ).order_by("-date_created") self.probeversion1_2 = probeversion1[1] self.probeversion1_3 = probeversion1[0] + self.mt1_history2 = admin_models.MetricTemplateHistory.objects.get( + object_id=self.metrictemplate1, probekey=self.probeversion1_2 + ) + self.mt8_history2 = admin_models.MetricTemplateHistory.objects.get( + object_id=metrictemplate8, version_comment='Newer version.' + ) + self.metric1 = poem_models.Metric.objects.get( name='argo.AMS-Check' ) @@ -2209,83 +2152,91 @@ def test_import_active_metrics_successfully(self): serialized_data2 = json.loads(history2[0].serialized_data)[0]['fields'] self.assertEqual(history1.count(), 1) self.assertEqual(metric1.name, self.metrictemplate3.name) - self.assertEqual(metric1.mtype, self.m_active) self.assertEqual(metric1.group, self.group) - self.assertEqual(metric1.description, self.metrictemplate3.description) - self.assertEqual(metric1.probekey, self.metrictemplate3.probekey) self.assertEqual( - metric1.probeexecutable, self.metrictemplate3.probeexecutable + metric1.probeversion, self.metrictemplate3.probekey.__str__() ) self.assertEqual(metric1.config, self.metrictemplate3.config) - self.assertEqual(metric1.attribute, self.metrictemplate3.attribute) - self.assertEqual(metric1.dependancy, self.metrictemplate3.dependency) - self.assertEqual(metric1.flags, self.metrictemplate3.flags) - self.assertEqual(metric1.files, self.metrictemplate3.files) - self.assertEqual(metric1.parameter, self.metrictemplate3.parameter) + self.assertEqual(serialized_data1['name'], metric1.name) self.assertEqual( - metric1.fileparameter, self.metrictemplate3.fileparameter + serialized_data1['mtype'][0], self.metrictemplate3.mtype.name ) - self.assertEqual(serialized_data1['name'], metric1.name) - self.assertEqual(serialized_data1['mtype'][0], metric1.mtype.name) self.assertEqual(serialized_data1['tags'], []) self.assertEqual(serialized_data1['group'][0], metric1.group.name) - self.assertEqual(serialized_data1['description'], metric1.description) + self.assertEqual( + serialized_data1['description'], self.metrictemplate3.description + ) self.assertEqual( serialized_data1['probekey'], - [metric1.probekey.name, metric1.probekey.package.version] + [ + self.metrictemplate3.probekey.name, + self.metrictemplate3.probekey.package.version + ] ) self.assertEqual( - serialized_data1['probeexecutable'], metric1.probeexecutable + serialized_data1['probeexecutable'], + self.metrictemplate3.probeexecutable ) self.assertEqual(serialized_data1['config'], metric1.config) - self.assertEqual(serialized_data1['attribute'], metric1.attribute) - self.assertEqual(serialized_data1['dependancy'], metric1.dependancy) - self.assertEqual(serialized_data1['flags'], metric1.flags) - self.assertEqual(serialized_data1['files'], metric1.files) - self.assertEqual(serialized_data1['parameter'], metric1.parameter) self.assertEqual( - serialized_data1['fileparameter'], 
metric1.fileparameter + serialized_data1['attribute'], self.metrictemplate3.attribute + ) + self.assertEqual( + serialized_data1['dependancy'], self.metrictemplate3.dependency + ) + self.assertEqual(serialized_data1['flags'], self.metrictemplate3.flags) + self.assertEqual(serialized_data1['files'], self.metrictemplate3.files) + self.assertEqual( + serialized_data1['parameter'], self.metrictemplate3.parameter + ) + self.assertEqual( + serialized_data1['fileparameter'], + self.metrictemplate3.fileparameter ) self.assertEqual(history2.count(), 1) self.assertEqual(metric2.name, self.metrictemplate6.name) - self.assertEqual(metric2.mtype, self.m_active) self.assertEqual(metric2.group, self.group) - self.assertEqual(metric2.description, self.metrictemplate6.description) - self.assertEqual(metric2.probekey, self.metrictemplate6.probekey) self.assertEqual( - metric2.probeexecutable, self.metrictemplate6.probeexecutable + metric2.probeversion, self.metrictemplate6.probekey.__str__() ) self.assertEqual(metric2.config, self.metrictemplate6.config) - self.assertEqual(metric2.attribute, self.metrictemplate6.attribute) - self.assertEqual(metric2.dependancy, self.metrictemplate6.dependency) - self.assertEqual(metric2.flags, self.metrictemplate6.flags) - self.assertEqual(metric2.files, self.metrictemplate6.files) - self.assertEqual(metric2.parameter, self.metrictemplate6.parameter) + self.assertEqual(serialized_data2['name'], metric2.name) self.assertEqual( - metric2.fileparameter, self.metrictemplate6.fileparameter + serialized_data2['mtype'][0], self.metrictemplate6.mtype.name ) - self.assertEqual(serialized_data2['name'], metric2.name) - self.assertEqual(serialized_data2['mtype'][0], metric2.mtype.name) self.assertEqual( serialized_data2['tags'], [['test_tag1'], ['test_tag2']] ) self.assertEqual(serialized_data2['group'][0], metric2.group.name) - self.assertEqual(serialized_data2['description'], metric2.description) + self.assertEqual( + serialized_data2['description'], self.metrictemplate6.description + ) self.assertEqual( serialized_data2['probekey'], - [metric2.probekey.name, metric2.probekey.package.version] + [ + self.metrictemplate6.probekey.name, + self.metrictemplate6.probekey.package.version + ] ) self.assertEqual( - serialized_data2['probeexecutable'], metric2.probeexecutable + serialized_data2['probeexecutable'], + self.metrictemplate6.probeexecutable ) self.assertEqual(serialized_data2['config'], metric2.config) - self.assertEqual(serialized_data2['attribute'], metric2.attribute) - self.assertEqual(serialized_data2['dependancy'], metric2.dependancy) - self.assertEqual(serialized_data2['flags'], metric2.flags) - self.assertEqual(serialized_data2['files'], metric2.files) - self.assertEqual(serialized_data2['parameter'], metric2.parameter) self.assertEqual( - serialized_data2['fileparameter'], metric2.fileparameter + serialized_data2['attribute'], self.metrictemplate6.attribute + ) + self.assertEqual( + serialized_data2['dependancy'], self.metrictemplate6.dependency + ) + self.assertEqual(serialized_data2['flags'], self.metrictemplate6.flags) + self.assertEqual(serialized_data2['files'], self.metrictemplate6.files) + self.assertEqual( + serialized_data2['parameter'], self.metrictemplate6.parameter + ) + self.assertEqual( + serialized_data2['fileparameter'], + self.metrictemplate6.fileparameter ) def test_import_passive_metric_successfully(self): @@ -2307,39 +2258,38 @@ def test_import_passive_metric_successfully(self): serialized_data1 = json.loads(history1[0].serialized_data)[0]['fields'] 
self.assertEqual(history1.count(), 1) self.assertEqual(metric1.name, self.metrictemplate7.name) - self.assertEqual(metric1.mtype, self.m_passive) self.assertEqual(metric1.group, self.group) - self.assertEqual(metric1.description, self.metrictemplate7.description) - self.assertEqual(metric1.probekey, self.metrictemplate7.probekey) - self.assertEqual( - metric1.probeexecutable, self.metrictemplate7.probeexecutable - ) + self.assertEqual(metric1.probeversion, None) self.assertEqual(metric1.config, self.metrictemplate7.config) - self.assertEqual(metric1.attribute, self.metrictemplate7.attribute) - self.assertEqual(metric1.dependancy, self.metrictemplate7.dependency) - self.assertEqual(metric1.flags, self.metrictemplate7.flags) - self.assertEqual(metric1.files, self.metrictemplate7.files) - self.assertEqual(metric1.parameter, self.metrictemplate7.parameter) + self.assertEqual(serialized_data1['name'], metric1.name) self.assertEqual( - metric1.fileparameter, self.metrictemplate7.fileparameter + serialized_data1['mtype'][0], self.metrictemplate7.mtype.name ) - self.assertEqual(serialized_data1['name'], metric1.name) - self.assertEqual(serialized_data1['mtype'][0], metric1.mtype.name) self.assertEqual(serialized_data1['tags'], []) self.assertEqual(serialized_data1['group'][0], metric1.group.name) - self.assertEqual(serialized_data1['description'], metric1.description) + self.assertEqual( + serialized_data1['description'], self.metrictemplate7.description + ) self.assertEqual(serialized_data1['probekey'], None) self.assertEqual( - serialized_data1['probeexecutable'], metric1.probeexecutable + serialized_data1['probeexecutable'], + self.metrictemplate7.probeexecutable ) self.assertEqual(serialized_data1['config'], metric1.config) - self.assertEqual(serialized_data1['attribute'], metric1.attribute) - self.assertEqual(serialized_data1['dependancy'], metric1.dependancy) - self.assertEqual(serialized_data1['flags'], metric1.flags) - self.assertEqual(serialized_data1['files'], metric1.files) - self.assertEqual(serialized_data1['parameter'], metric1.parameter) self.assertEqual( - serialized_data1['fileparameter'], metric1.fileparameter + serialized_data1['attribute'], self.metrictemplate7.attribute + ) + self.assertEqual( + serialized_data1['dependancy'], self.metrictemplate7.dependency + ) + self.assertEqual(serialized_data1['flags'], self.metrictemplate7.flags) + self.assertEqual(serialized_data1['files'], self.metrictemplate7.files) + self.assertEqual( + serialized_data1['parameter'], self.metrictemplate7.parameter + ) + self.assertEqual( + serialized_data1['fileparameter'], + self.metrictemplate7.fileparameter ) def test_import_active_metric_with_warning(self): @@ -2361,42 +2311,45 @@ def test_import_active_metric_with_warning(self): serialized_data1 = json.loads(history1[0].serialized_data)[0]['fields'] self.assertEqual(history1.count(), 1) self.assertEqual(metric1.name, self.mt8_history2.name) - self.assertEqual(metric1.mtype, self.m_active) self.assertEqual(metric1.group, self.group) - self.assertEqual(metric1.description, self.mt8_history2.description) - self.assertEqual(metric1.probekey, self.mt8_history2.probekey) self.assertEqual( - metric1.probeexecutable, self.mt8_history2.probeexecutable + metric1.probeversion, self.mt8_history2.probekey.__str__() ) self.assertEqual(metric1.config, self.mt8_history2.config) - self.assertEqual(metric1.attribute, self.mt8_history2.attribute) - self.assertEqual(metric1.dependancy, self.mt8_history2.dependency) - self.assertEqual(metric1.flags, 
self.mt8_history2.flags) - self.assertEqual(metric1.files, self.mt8_history2.files) - self.assertEqual(metric1.parameter, self.mt8_history2.parameter) + self.assertEqual(serialized_data1['name'], metric1.name) self.assertEqual( - metric1.fileparameter, self.mt8_history2.fileparameter + serialized_data1['mtype'][0], self.mt8_history2.mtype.name ) - self.assertEqual(serialized_data1['name'], metric1.name) - self.assertEqual(serialized_data1['mtype'][0], metric1.mtype.name) self.assertEqual(serialized_data1['tags'], [['test_tag1']]) self.assertEqual(serialized_data1['group'][0], metric1.group.name) - self.assertEqual(serialized_data1['description'], metric1.description) + self.assertEqual( + serialized_data1['description'], self.mt8_history2.description + ) self.assertEqual( serialized_data1['probekey'], - [metric1.probekey.name, metric1.probekey.package.version] + [ + self.mt8_history2.probekey.name, + self.mt8_history2.probekey.package.version + ] ) self.assertEqual( - serialized_data1['probeexecutable'], metric1.probeexecutable + serialized_data1['probeexecutable'], + self.mt8_history2.probeexecutable ) self.assertEqual(serialized_data1['config'], metric1.config) - self.assertEqual(serialized_data1['attribute'], metric1.attribute) - self.assertEqual(serialized_data1['dependancy'], metric1.dependancy) - self.assertEqual(serialized_data1['flags'], metric1.flags) - self.assertEqual(serialized_data1['files'], metric1.files) - self.assertEqual(serialized_data1['parameter'], metric1.parameter) self.assertEqual( - serialized_data1['fileparameter'], metric1.fileparameter + serialized_data1['attribute'], self.mt8_history2.attribute + ) + self.assertEqual( + serialized_data1['dependancy'], self.mt8_history2.dependency + ) + self.assertEqual(serialized_data1['flags'], self.mt8_history2.flags) + self.assertEqual(serialized_data1['files'], self.mt8_history2.files) + self.assertEqual( + serialized_data1['parameter'], self.mt8_history2.parameter + ) + self.assertEqual( + serialized_data1['fileparameter'], self.mt8_history2.fileparameter ) def test_import_active_metric_if_package_already_exists_with_diff_version( @@ -2419,44 +2372,46 @@ def test_import_active_metric_if_package_already_exists_with_diff_version( serialized_data1 = json.loads(history1[0].serialized_data)[0]['fields'] self.assertEqual(history1.count(), 1) self.assertEqual(metric1.name, self.metric5.name) - self.assertEqual(metric1.mtype, self.metric5.mtype) self.assertEqual(metric1.group, self.metric5.group) - self.assertEqual(metric1.description, self.metric5.description) - self.assertEqual(metric1.probekey, self.metric5.probekey) - self.assertEqual( - metric1.probeexecutable, self.metric5.probeexecutable - ) + self.assertEqual(metric1.probeversion, self.metric5.probeversion) self.assertEqual(metric1.config, self.metric5.config) - self.assertEqual(metric1.attribute, self.metric5.attribute) - self.assertEqual(metric1.dependancy, self.metric5.dependancy) - self.assertEqual(metric1.flags, self.metric5.flags) - self.assertEqual(metric1.files, self.metric5.files) - self.assertEqual(metric1.parameter, self.metric5.parameter) + self.assertEqual(serialized_data1['name'], metric1.name) self.assertEqual( - metric1.fileparameter, self.metric5.fileparameter + serialized_data1['mtype'][0], self.metrictemplate10.mtype.name ) - self.assertEqual(serialized_data1['name'], metric1.name) - self.assertEqual(serialized_data1['mtype'][0], metric1.mtype.name) self.assertEqual( serialized_data1['tags'], [['test_tag1'], ['test_tag2']] ) 
self.assertEqual(serialized_data1['group'][0], metric1.group.name) - self.assertEqual(serialized_data1['description'], metric1.description) + self.assertEqual( + serialized_data1['description'], self.metrictemplate10.description + ) self.assertEqual( serialized_data1['probekey'], - [metric1.probekey.name, metric1.probekey.package.version] + [ + self.metrictemplate10.probekey.name, + self.metrictemplate10.probekey.package.version + ] ) self.assertEqual( - serialized_data1['probeexecutable'], metric1.probeexecutable + serialized_data1['probeexecutable'], + self.metrictemplate10.probeexecutable ) self.assertEqual(serialized_data1['config'], metric1.config) - self.assertEqual(serialized_data1['attribute'], metric1.attribute) - self.assertEqual(serialized_data1['dependancy'], metric1.dependancy) - self.assertEqual(serialized_data1['flags'], metric1.flags) - self.assertEqual(serialized_data1['files'], metric1.files) - self.assertEqual(serialized_data1['parameter'], metric1.parameter) self.assertEqual( - serialized_data1['fileparameter'], metric1.fileparameter + serialized_data1['attribute'], self.metrictemplate10.attribute + ) + self.assertEqual( + serialized_data1['dependancy'], self.metrictemplate10.dependency + ) + self.assertEqual(serialized_data1['flags'], self.metrictemplate10.flags) + self.assertEqual(serialized_data1['files'], self.metrictemplate10.files) + self.assertEqual( + serialized_data1['parameter'], self.metrictemplate10.parameter + ) + self.assertEqual( + serialized_data1['fileparameter'], + self.metrictemplate10.fileparameter ) poem_models.Metric.objects.get(name='argo.AMS-Check') poem_models.Metric.objects.get(name='org.nagios.CertLifetime') @@ -2486,91 +2441,92 @@ def test_import_active_metrics_with_error(self): serialized_data1 = json.loads(history1[0].serialized_data)[0]['fields'] self.assertEqual(history1.count(), 1) self.assertEqual(metric1.name, self.metric1.name) - self.assertEqual(metric1.mtype, self.metric1.mtype) self.assertEqual(metric1.group, self.metric1.group) - self.assertEqual(metric1.description, self.metric1.description) - self.assertEqual(metric1.probekey, self.metric1.probekey) - self.assertEqual( - metric1.probeexecutable, self.metric1.probeexecutable - ) + self.assertEqual(metric1.probeversion, self.metric1.probeversion) self.assertEqual(metric1.config, self.metric1.config) - self.assertEqual(metric1.attribute, self.metric1.attribute) - self.assertEqual(metric1.dependancy, self.metric1.dependancy) - self.assertEqual(metric1.flags, self.metric1.flags) - self.assertEqual(metric1.files, self.metric1.files) - self.assertEqual(metric1.parameter, self.metric1.parameter) + self.assertEqual(serialized_data1['name'], metric1.name) self.assertEqual( - metric1.fileparameter, self.metric1.fileparameter + serialized_data1['mtype'][0], self.mt1_history2.mtype.name ) - self.assertEqual(serialized_data1['name'], metric1.name) - self.assertEqual(serialized_data1['mtype'][0], metric1.mtype.name) self.assertEqual( serialized_data1['tags'], [['test_tag1'], ['test_tag2'], ['test_tag3']] ) self.assertEqual(serialized_data1['group'][0], metric1.group.name) - self.assertEqual(serialized_data1['description'], metric1.description) + self.assertEqual( + serialized_data1['description'], self.mt1_history2.description + ) self.assertEqual( serialized_data1['probekey'], - [metric1.probekey.name, metric1.probekey.package.version] + [ + self.mt1_history2.probekey.name, + self.mt1_history2.probekey.package.version + ] ) self.assertEqual( - serialized_data1['probeexecutable'], 
metric1.probeexecutable + serialized_data1['probeexecutable'], + self.mt1_history2.probeexecutable ) self.assertEqual(serialized_data1['config'], metric1.config) - self.assertEqual(serialized_data1['attribute'], metric1.attribute) - self.assertEqual(serialized_data1['dependancy'], metric1.dependancy) - self.assertEqual(serialized_data1['flags'], metric1.flags) - self.assertEqual(serialized_data1['files'], metric1.files) - self.assertEqual(serialized_data1['parameter'], metric1.parameter) self.assertEqual( - serialized_data1['fileparameter'], metric1.fileparameter + serialized_data1['attribute'], self.mt1_history2.attribute ) - metric2 = poem_models.Metric.objects.get( - name='org.nagios.CertLifetime' + self.assertEqual( + serialized_data1['dependancy'], self.mt1_history2.dependency ) + self.assertEqual(serialized_data1['flags'], self.mt1_history2.flags) + self.assertEqual(serialized_data1['files'], self.mt1_history2.files) + self.assertEqual( + serialized_data1['parameter'], self.mt1_history2.parameter + ) + self.assertEqual( + serialized_data1['fileparameter'], self.mt1_history2.fileparameter + ) + metric2 = poem_models.Metric.objects.get(name='org.nagios.CertLifetime') history2 = poem_models.TenantHistory.objects.filter( object_id=metric2.id ).order_by('-date_created') serialized_data2 = json.loads(history2[0].serialized_data)[0]['fields'] self.assertEqual(history2.count(), 1) self.assertEqual(metric2.name, self.metric2.name) - self.assertEqual(metric2.mtype, self.metric2.mtype) self.assertEqual(metric2.group, self.metric2.group) - self.assertEqual(metric2.description, self.metric2.description) - self.assertEqual(metric2.probekey, self.metric2.probekey) - self.assertEqual( - metric2.probeexecutable, self.metric2.probeexecutable - ) + self.assertEqual(metric2.probeversion, self.metric2.probeversion) self.assertEqual(metric2.config, self.metric2.config) - self.assertEqual(metric2.attribute, self.metric2.attribute) - self.assertEqual(metric2.dependancy, self.metric2.dependancy) - self.assertEqual(metric2.flags, self.metric2.flags) - self.assertEqual(metric2.files, self.metric2.files) - self.assertEqual(metric2.parameter, self.metric2.parameter) + self.assertEqual(serialized_data2['name'], metric2.name) self.assertEqual( - metric2.fileparameter, self.metric2.fileparameter + serialized_data2['mtype'][0], self.metrictemplate2.mtype.name ) - self.assertEqual(serialized_data2['name'], metric2.name) - self.assertEqual(serialized_data2['mtype'][0], metric2.mtype.name) self.assertEqual(serialized_data2['tags'], [['test_tag2']]) self.assertEqual(serialized_data2['group'][0], metric2.group.name) - self.assertEqual(serialized_data2['description'], metric2.description) + self.assertEqual( + serialized_data2['description'], self.metrictemplate2.description + ) self.assertEqual( serialized_data2['probekey'], - [metric2.probekey.name, metric2.probekey.package.version] + [ + self.metrictemplate2.probekey.name, + self.metrictemplate2.probekey.package.version + ] ) self.assertEqual( - serialized_data2['probeexecutable'], metric2.probeexecutable + serialized_data2['probeexecutable'], + self.metrictemplate2.probeexecutable ) self.assertEqual(serialized_data2['config'], metric2.config) - self.assertEqual(serialized_data2['attribute'], metric2.attribute) - self.assertEqual(serialized_data2['dependancy'], metric2.dependancy) - self.assertEqual(serialized_data2['flags'], metric2.flags) - self.assertEqual(serialized_data2['files'], metric2.files) - self.assertEqual(serialized_data2['parameter'], metric2.parameter) 
self.assertEqual( - serialized_data2['fileparameter'], metric2.fileparameter + serialized_data2['attribute'], self.metrictemplate2.attribute + ) + self.assertEqual( + serialized_data2['dependancy'], self.metrictemplate2.dependency + ) + self.assertEqual(serialized_data2['flags'], self.metrictemplate2.flags) + self.assertEqual(serialized_data2['files'], self.metrictemplate2.files) + self.assertEqual( + serialized_data2['parameter'], self.metrictemplate2.parameter + ) + self.assertEqual( + serialized_data2['fileparameter'], + self.metrictemplate2.fileparameter ) def test_import_metric_older_version_than_tenants_package(self): @@ -2620,6 +2576,30 @@ def setUp(self): self.probeversion1_2 = probeversion1[1] self.probeversion1_3 = probeversion1[0] + @staticmethod + def _create_metric_template_history(metrictemplate, tags): + history = admin_models.MetricTemplateHistory.objects.create( + object_id=metrictemplate, + name=metrictemplate.name, + mtype=metrictemplate.mtype, + probekey=metrictemplate.probekey, + description=metrictemplate.description, + probeexecutable=metrictemplate.probeexecutable, + config=metrictemplate.config, + attribute=metrictemplate.attribute, + dependency=metrictemplate.dependency, + flags=metrictemplate.flags, + files=metrictemplate.files, + parameter=metrictemplate.parameter, + fileparameter=metrictemplate.fileparameter, + parent=metrictemplate.parent, + date_created=datetime.datetime.now(), + version_user="poem", + version_comment='Test version.' + ) + for tag in tags: + history.tags.add(tag) + @patch('Poem.helpers.metrics_helpers.update_metrics_in_profiles') def test_update_active_metrics_from_metrictemplate_instance( self, mock_update @@ -2646,6 +2626,7 @@ def test_update_active_metrics_from_metrictemplate_instance( metrictemplate.fileparameter = '["fp-key fp-val"]' metrictemplate.save() metrictemplate.tags.remove(self.mtag2) + self._create_metric_template_history(metrictemplate, tags=[self.mtag1]) update_metric_in_schema( mt_id=metrictemplate.id, name='argo.AMS-Check', pk_id=self.probeversion1_2.id, schema=self.tenant.schema_name @@ -2658,51 +2639,50 @@ def test_update_active_metrics_from_metrictemplate_instance( metric_versions = poem_models.TenantHistory.objects.filter( object_id=metric.id, content_type=self.ct ).order_by('-date_created') + self.assertEqual(metric_versions.count(), 1) serialized_data = json.loads( metric_versions[0].serialized_data )[0]['fields'] - self.assertEqual(metric_versions.count(), 1) self.assertEqual(metric.name, metrictemplate.name) - self.assertEqual(metric.mtype.name, metrictemplate.mtype.name) - self.assertEqual(metric.probekey, metrictemplate.probekey) - self.assertEqual(metric.description, metrictemplate.description) + self.assertEqual(metric.probeversion, metrictemplate.probekey.__str__()) self.assertEqual(metric.group.name, 'TEST') - self.assertEqual(metric.parent, metrictemplate.parent) - self.assertEqual(metric.probeexecutable, metrictemplate.probeexecutable) self.assertEqual( metric.config, '["maxCheckAttempts 4", "timeout 60", ' '"path /usr/libexec/argo-monitoring/probes/argo", ' '"interval 5", "retryInterval 2"]' ) - self.assertEqual(metric.attribute, metrictemplate.attribute) - self.assertEqual(metric.dependancy, metrictemplate.dependency) - self.assertEqual(metric.flags, metrictemplate.flags) - self.assertEqual(metric.files, metrictemplate.files) - self.assertEqual(metric.parameter, metrictemplate.parameter) - self.assertEqual(metric.fileparameter, metrictemplate.fileparameter) self.assertEqual(serialized_data['name'], 
metric.name) - self.assertEqual(serialized_data['mtype'], [metric.mtype.name]) + self.assertEqual(serialized_data['mtype'], [metrictemplate.mtype.name]) self.assertEqual( serialized_data['tags'], [['test_tag1'], ['test_tag3']] ) - self.assertEqual(serialized_data['description'], metric.description) + self.assertEqual( + serialized_data['description'], metrictemplate.description + ) self.assertEqual( serialized_data['probekey'], - [metric.probekey.name, metric.probekey.package.version] + [ + metrictemplate.probekey.name, + metrictemplate.probekey.package.version + ] ) self.assertEqual(serialized_data['group'], ['TEST']) - self.assertEqual(serialized_data['parent'], metric.parent) + self.assertEqual(serialized_data['parent'], metrictemplate.parent) self.assertEqual( - serialized_data['probeexecutable'], metric.probeexecutable + serialized_data['probeexecutable'], metrictemplate.probeexecutable ) self.assertEqual(serialized_data['config'], metric.config) - self.assertEqual(serialized_data['attribute'], metric.attribute) - self.assertEqual(serialized_data['dependancy'], metric.dependancy) - self.assertEqual(serialized_data['flags'], metric.flags) - self.assertEqual(serialized_data['files'], metric.files) - self.assertEqual(serialized_data['parameter'], metric.parameter) - self.assertEqual(serialized_data['fileparameter'], metric.fileparameter) + self.assertEqual(serialized_data['attribute'], metrictemplate.attribute) + self.assertEqual( + serialized_data['dependancy'], metrictemplate.dependency + ) + self.assertEqual(serialized_data['flags'], metrictemplate.flags) + self.assertEqual(serialized_data['files'], metrictemplate.files) + self.assertEqual(serialized_data['parameter'], metrictemplate.parameter) + self.assertEqual( + serialized_data['fileparameter'], metrictemplate.fileparameter + ) with schema_context('test2'): metric1 = poem_models.Metric.objects.get(name='argo.AMS-Check') metric_versions1 = poem_models.TenantHistory.objects.filter( @@ -2712,55 +2692,47 @@ def test_update_active_metrics_from_metrictemplate_instance( serialized_data1 = json.loads( metric_versions1[0].serialized_data )[0]['fields'] - self.assertEqual(metric1.name, 'argo.AMS-Check') - self.assertEqual(metric1.mtype.name, 'Active') - self.assertEqual(metric1.probekey, metrictemplate.probekey) self.assertEqual( - metric1.description, - 'Some description of argo.AMS-Check metric template.' + metric1.probeversion, metrictemplate.probekey.__str__() ) self.assertEqual(metric1.group.name, 'TEST2') - self.assertEqual(metric1.parent, '') - self.assertEqual(metric1.probeexecutable, '["ams-probe"]') self.assertEqual( metric1.config, '["maxCheckAttempts 4", "timeout 60",' ' "path /usr/libexec/argo-monitoring/probes/argo",' ' "interval 5", "retryInterval 2"]' ) - self.assertEqual(metric1.attribute, '["argo.ams_TOKEN --token"]') - self.assertEqual(metric1.dependancy, '') - self.assertEqual(metric1.flags, '["OBSESS 1"]') - self.assertEqual(metric1.files, '') - self.assertEqual(metric1.parameter, '["--project EGI"]') - self.assertEqual(metric1.fileparameter, '') self.assertEqual(serialized_data1['name'], metric1.name) - self.assertEqual(serialized_data1['mtype'], [metric1.mtype.name]) + self.assertEqual(serialized_data1['mtype'], ["Active"]) self.assertEqual( serialized_data1['tags'], [['test_tag1'], ['test_tag2'], ['test_tag3']] ) self.assertEqual( - serialized_data1['description'], metric1.description + serialized_data1['description'], + "Some description of argo.AMS-Check metric template." 
) self.assertEqual( serialized_data1['probekey'], - [metric1.probekey.name, metric1.probekey.package.version] + [ + metrictemplate.probekey.name, + metrictemplate.probekey.package.version + ] ) self.assertEqual(serialized_data1['group'], ['TEST2']) - self.assertEqual(serialized_data1['parent'], metric1.parent) + self.assertEqual(serialized_data1['parent'], "") self.assertEqual( - serialized_data1['probeexecutable'], metric1.probeexecutable + serialized_data1['probeexecutable'], '["ams-probe"]' ) self.assertEqual(serialized_data1['config'], metric1.config) - self.assertEqual(serialized_data1['attribute'], metric1.attribute) - self.assertEqual(serialized_data1['dependancy'], metric1.dependancy) - self.assertEqual(serialized_data1['flags'], metric1.flags) - self.assertEqual(serialized_data1['files'], metric1.files) - self.assertEqual(serialized_data1['parameter'], metric1.parameter) self.assertEqual( - serialized_data1['fileparameter'], metric1.fileparameter + serialized_data1['attribute'], '["argo.ams_TOKEN --token"]' ) + self.assertEqual(serialized_data1['dependancy'], "") + self.assertEqual(serialized_data1['flags'], '["OBSESS 1"]') + self.assertEqual(serialized_data1['files'], "") + self.assertEqual(serialized_data1['parameter'], '["--project EGI"]') + self.assertEqual(serialized_data1['fileparameter'], "") @patch('Poem.helpers.metrics_helpers.update_metrics_in_profiles') def test_update_passive_metrics_from_metrictemplate_instance( @@ -2770,6 +2742,9 @@ def test_update_passive_metrics_from_metrictemplate_instance( metrictemplate = admin_models.MetricTemplate.objects.get( name='org.apel.APEL-Pub' ) + mth = admin_models.MetricTemplateHistory.objects.get( + name="org.apel.APEL-Pub" + ) metrictemplate.name = 'org.apel.APEL-Pub-new' metrictemplate.mtype = self.mt_passive metrictemplate.description = 'Added description for ' \ @@ -2786,6 +2761,20 @@ def test_update_passive_metrics_from_metrictemplate_instance( metrictemplate.fileparameter = '' metrictemplate.save() metrictemplate.tags.add(self.mtag1) + mth.name = metrictemplate.name + mth.mtype = metrictemplate.mtype + mth.description = metrictemplate.description + mth.probekey = metrictemplate.probekey + mth.parent = metrictemplate.parent + mth.probeexecutable = metrictemplate.probeexecutable + mth.config = metrictemplate.config + mth.attribute = metrictemplate.attribute + mth.dependency = metrictemplate.dependency + mth.parameter = metrictemplate.parameter + mth.flags = metrictemplate.flags + mth.files = metrictemplate.files + mth.fileparameter = metrictemplate.fileparameter + mth.save() update_metric_in_schema( mt_id=metrictemplate.id, name='org.apel.APEL-Pub', pk_id=None, schema='test' @@ -2803,38 +2792,31 @@ def test_update_passive_metrics_from_metrictemplate_instance( )[0]['fields'] self.assertEqual(metric_versions.count(), 1) self.assertEqual(metric.name, metrictemplate.name) - self.assertEqual(metric.mtype.name, metrictemplate.mtype.name) - self.assertEqual(metric.probekey, metrictemplate.probekey) - self.assertEqual(metric.description, metrictemplate.description) + self.assertEqual(metric.probeversion, metrictemplate.probekey) self.assertEqual(metric.group.name, 'TEST') - self.assertEqual(metric.parent, metrictemplate.parent) - self.assertEqual(metric.probeexecutable, - metrictemplate.probeexecutable) self.assertEqual(metric.config, '') - self.assertEqual(metric.attribute, metrictemplate.attribute) - self.assertEqual(metric.dependancy, metrictemplate.dependency) - self.assertEqual(metric.flags, metrictemplate.flags) - 
self.assertEqual(metric.files, metrictemplate.files) - self.assertEqual(metric.parameter, metrictemplate.parameter) - self.assertEqual(metric.fileparameter, metrictemplate.fileparameter) self.assertEqual(serialized_data['name'], metric.name) - self.assertEqual(serialized_data['mtype'], [metric.mtype.name]) + self.assertEqual(serialized_data['mtype'], [metrictemplate.mtype.name]) self.assertEqual(serialized_data['tags'], [['test_tag1']]) - self.assertEqual(serialized_data['description'], metric.description) + self.assertEqual( + serialized_data['description'], metrictemplate.description + ) self.assertEqual(serialized_data['probekey'], None) self.assertEqual(serialized_data['group'], ['TEST']) - self.assertEqual(serialized_data['parent'], metric.parent) + self.assertEqual(serialized_data['parent'], metrictemplate.parent) self.assertEqual( - serialized_data['probeexecutable'], metric.probeexecutable + serialized_data['probeexecutable'], metrictemplate.probeexecutable ) self.assertEqual(serialized_data['config'], metric.config) - self.assertEqual(serialized_data['attribute'], metric.attribute) - self.assertEqual(serialized_data['dependancy'], metric.dependancy) - self.assertEqual(serialized_data['flags'], metric.flags) - self.assertEqual(serialized_data['files'], metric.files) - self.assertEqual(serialized_data['parameter'], metric.parameter) + self.assertEqual(serialized_data['attribute'], metrictemplate.attribute) self.assertEqual( - serialized_data['fileparameter'], metric.fileparameter + serialized_data['dependancy'], metrictemplate.dependency + ) + self.assertEqual(serialized_data['flags'], metrictemplate.flags) + self.assertEqual(serialized_data['files'], metrictemplate.files) + self.assertEqual(serialized_data['parameter'], metrictemplate.parameter) + self.assertEqual( + serialized_data['fileparameter'], metrictemplate.fileparameter ) with schema_context('test2'): metric1 = poem_models.Metric.objects.get( @@ -2846,38 +2828,25 @@ def test_update_passive_metrics_from_metrictemplate_instance( metric_versions1[0].serialized_data )[0]['fields'] self.assertEqual(metric_versions1.count(), 1) - self.assertEqual(metric1.mtype.name, 'Passive') - self.assertEqual(metric1.probekey, None) - self.assertEqual(metric1.description, '') + self.assertEqual(metric1.probeversion, None) self.assertEqual(metric1.group.name, 'TEST2') - self.assertEqual(metric1.parent, '') - self.assertEqual(metric1.probeexecutable, '') - self.assertEqual(metric1.config, '') - self.assertEqual(metric1.attribute, '') - self.assertEqual(metric1.dependancy, '') - self.assertEqual(metric1.flags, '["OBSESS 1", "PASSIVE 1"]') - self.assertEqual(metric1.files, '') - self.assertEqual(metric1.parameter, '') - self.assertEqual(metric1.fileparameter, '') self.assertEqual(serialized_data1['name'], metric1.name) - self.assertEqual(serialized_data1['mtype'], [metric.mtype.name]) + self.assertEqual(serialized_data1['mtype'], ["Passive"]) self.assertEqual(serialized_data1['tags'], []) - self.assertEqual(serialized_data1['description'], metric1.description) + self.assertEqual(serialized_data1['description'], "") self.assertEqual(serialized_data1['probekey'], None) self.assertEqual(serialized_data1['group'], ['TEST2']) - self.assertEqual(serialized_data1['parent'], metric1.parent) - self.assertEqual( - serialized_data1['probeexecutable'], metric1.probeexecutable - ) + self.assertEqual(serialized_data1['parent'], "") + self.assertEqual(serialized_data1['probeexecutable'], "") self.assertEqual(serialized_data1['config'], metric1.config) - 
self.assertEqual(serialized_data1['attribute'], metric1.attribute) - self.assertEqual(serialized_data1['dependancy'], metric1.dependancy) - self.assertEqual(serialized_data1['flags'], metric1.flags) - self.assertEqual(serialized_data1['files'], metric1.files) - self.assertEqual(serialized_data1['parameter'], metric1.parameter) + self.assertEqual(serialized_data1['attribute'], "") + self.assertEqual(serialized_data1['dependancy'], "") self.assertEqual( - serialized_data1['fileparameter'], metric1.fileparameter + serialized_data1['flags'], '["OBSESS 1", "PASSIVE 1"]' ) + self.assertEqual(serialized_data1['files'], "") + self.assertEqual(serialized_data1['parameter'], "") + self.assertEqual(serialized_data1['fileparameter'], "") @patch('Poem.helpers.metrics_helpers.update_metrics_in_profiles') def test_update_active_metrics_from_metrictemplatehistory_instance( @@ -2927,41 +2896,40 @@ def test_update_active_metrics_from_metrictemplatehistory_instance( )[0]['fields'] self.assertEqual(metric_versions.count(), 2) self.assertEqual(metric.name, metrictemplate.name) - self.assertEqual(metric.mtype.name, metrictemplate.mtype.name) - self.assertEqual(metric.probekey, metrictemplate.probekey) - self.assertEqual(metric.description, metrictemplate.description) + self.assertEqual(metric.probeversion, metrictemplate.probekey.__str__()) self.assertEqual(metric.group.name, 'TEST') - self.assertEqual(metric.parent, metrictemplate.parent) - self.assertEqual(metric.probeexecutable, metrictemplate.probeexecutable) self.assertEqual(metric.config, metrictemplate.config) - self.assertEqual(metric.attribute, metrictemplate.attribute) - self.assertEqual(metric.dependancy, metrictemplate.dependency) - self.assertEqual(metric.flags, metrictemplate.flags) - self.assertEqual(metric.files, metrictemplate.files) - self.assertEqual(metric.parameter, metrictemplate.parameter) - self.assertEqual(metric.fileparameter, metrictemplate.fileparameter) - self.assertEqual(serialized_data['name'], metric.name) - self.assertEqual(serialized_data['mtype'], [metric.mtype.name]) + self.assertEqual(serialized_data['name'], metrictemplate.name) + self.assertEqual(serialized_data['mtype'], [metrictemplate.mtype.name]) self.assertEqual( serialized_data['tags'], [['test_tag1'], ['test_tag2']] ) - self.assertEqual(serialized_data['description'], metric.description) + self.assertEqual( + serialized_data['description'], metrictemplate.description + ) self.assertEqual( serialized_data['probekey'], - [metric.probekey.name, metric.probekey.package.version] + [ + metrictemplate.probekey.name, + metrictemplate.probekey.package.version + ] ) self.assertEqual(serialized_data['group'], ['TEST']) - self.assertEqual(serialized_data['parent'], metric.parent) + self.assertEqual(serialized_data['parent'], metrictemplate.parent) self.assertEqual( - serialized_data['probeexecutable'], metric.probeexecutable + serialized_data['probeexecutable'], metrictemplate.probeexecutable ) self.assertEqual(serialized_data['config'], metric.config) - self.assertEqual(serialized_data['attribute'], metric.attribute) - self.assertEqual(serialized_data['dependancy'], metric.dependancy) - self.assertEqual(serialized_data['flags'], metric.flags) - self.assertEqual(serialized_data['files'], metric.files) - self.assertEqual(serialized_data['parameter'], metric.parameter) - self.assertEqual(serialized_data['fileparameter'], metric.fileparameter) + self.assertEqual(serialized_data['attribute'], metrictemplate.attribute) + self.assertEqual( + serialized_data['dependancy'], 
metrictemplate.dependency + ) + self.assertEqual(serialized_data['flags'], metrictemplate.flags) + self.assertEqual(serialized_data['files'], metrictemplate.files) + self.assertEqual(serialized_data['parameter'], metrictemplate.parameter) + self.assertEqual( + serialized_data['fileparameter'], metrictemplate.fileparameter + ) with schema_context('test2'): metric1 = poem_models.Metric.objects.get(name='argo.AMS-Check') metric_versions1 = poem_models.TenantHistory.objects.filter( @@ -2971,50 +2939,42 @@ def test_update_active_metrics_from_metrictemplatehistory_instance( metric_versions1[0].serialized_data )[0]['fields'] self.assertEqual(metric_versions1.count(), 1) - self.assertEqual(metric1.mtype.name, 'Active') - self.assertEqual(metric1.probekey, metrictemplate.probekey) self.assertEqual( - metric1.description, - 'Some description of argo.AMS-Check metric template.' + metric1.probeversion, metrictemplate.probekey.__str__() ) self.assertEqual(metric1.group.name, 'TEST2') - self.assertEqual(metric1.parent, '') - self.assertEqual( - metric1.probeexecutable, '["ams-probe"]') self.assertEqual(metric.config, metrictemplate.config) - self.assertEqual(metric1.attribute, '["argo.ams_TOKEN --token"]') - self.assertEqual(metric1.dependancy, '') - self.assertEqual(metric1.flags, '["OBSESS 1"]') - self.assertEqual(metric1.files, '') - self.assertEqual(metric1.parameter, '["--project EGI"]') - self.assertEqual(metric1.fileparameter, '') self.assertEqual(serialized_data1['name'], metric1.name) - self.assertEqual(serialized_data1['mtype'], [metric1.mtype.name]) + self.assertEqual(serialized_data1['mtype'], ["Active"]) self.assertEqual( serialized_data1['tags'], [['test_tag1'], ['test_tag2'], ['test_tag3']] ) self.assertEqual( - serialized_data1['description'], metric1.description + serialized_data1['description'], + "Some description of argo.AMS-Check metric template." 
) self.assertEqual( serialized_data1['probekey'], - [metric1.probekey.name, metric1.probekey.package.version] + [ + metrictemplate.probekey.name, + metrictemplate.probekey.package.version + ] ) self.assertEqual(serialized_data1['group'], ['TEST2']) - self.assertEqual(serialized_data1['parent'], metric1.parent) + self.assertEqual(serialized_data1['parent'], "") self.assertEqual( - serialized_data1['probeexecutable'], metric1.probeexecutable + serialized_data1['probeexecutable'], '["ams-probe"]' ) self.assertEqual(serialized_data1['config'], metric1.config) - self.assertEqual(serialized_data1['attribute'], metric1.attribute) - self.assertEqual(serialized_data1['dependancy'], metric1.dependancy) - self.assertEqual(serialized_data1['flags'], metric1.flags) - self.assertEqual(serialized_data1['files'], metric1.files) - self.assertEqual(serialized_data1['parameter'], metric1.parameter) self.assertEqual( - serialized_data1['fileparameter'], metric1.fileparameter + serialized_data1['attribute'], '["argo.ams_TOKEN --token"]' ) + self.assertEqual(serialized_data1['dependancy'], "") + self.assertEqual(serialized_data1['flags'], '["OBSESS 1"]') + self.assertEqual(serialized_data1['files'], "") + self.assertEqual(serialized_data1['parameter'], '["--project EGI"]') + self.assertEqual(serialized_data1['fileparameter'], "") @patch('Poem.helpers.metrics_helpers.update_metrics_in_profiles') def test_update_metrics_if_different_metrictemplate_version_from_mt_inst( @@ -3024,6 +2984,9 @@ def test_update_metrics_if_different_metrictemplate_version_from_mt_inst( metrictemplate = admin_models.MetricTemplate.objects.get( name='argo.AMS-Check' ) + mth = admin_models.MetricTemplateHistory.objects.get( + name="argo.AMS-Check", probekey=self.probeversion1_3 + ) metrictemplate.mtype = self.mt_active metrictemplate.description = 'New description for the metric.' 
metrictemplate.probekey = self.probeversion1_3 @@ -3041,6 +3004,21 @@ def test_update_metrics_if_different_metrictemplate_version_from_mt_inst( metrictemplate.fileparameter = '["fp-key fp-val"]' metrictemplate.save() metrictemplate.tags.remove(self.mtag2) + mth.name = metrictemplate.name + mth.mtype = metrictemplate.mtype + mth.description = metrictemplate.description + mth.probekey = metrictemplate.probekey + mth.parent = metrictemplate.parent + mth.probeexecutable = metrictemplate.probeexecutable + mth.config = metrictemplate.config + mth.attribute = metrictemplate.attribute + mth.dependency = metrictemplate.dependency + mth.parameter = metrictemplate.parameter + mth.flags = metrictemplate.flags + mth.files = metrictemplate.files + mth.fileparameter = metrictemplate.fileparameter + mth.save() + mth.tags.remove(self.mtag2) update_metric_in_schema( mt_id=metrictemplate.id, name='argo.AMS-Check', pk_id=self.probeversion1_2.id, schema=self.tenant.schema_name, @@ -3056,46 +3034,45 @@ def test_update_metrics_if_different_metrictemplate_version_from_mt_inst( self.assertEqual(metric_versions.count(), 2) self.assertEqual(metric_versions[0].user, 'testuser') self.assertEqual(metric.name, metrictemplate.name) - self.assertEqual(metric.mtype.name, metrictemplate.mtype.name) - self.assertEqual(metric.probekey, metrictemplate.probekey) - self.assertEqual(metric.description, metrictemplate.description) + self.assertEqual(metric.probeversion, metrictemplate.probekey.__str__()) self.assertEqual(metric.group.name, 'TEST') - self.assertEqual(metric.parent, metrictemplate.parent) - self.assertEqual(metric.probeexecutable, metrictemplate.probeexecutable) self.assertEqual( metric.config, '["maxCheckAttempts 4", "timeout 60", ' '"path /usr/libexec/argo-monitoring/probes/argo", ' '"interval 5", "retryInterval 2"]' ) - self.assertEqual(metric.attribute, metrictemplate.attribute) - self.assertEqual(metric.dependancy, metrictemplate.dependency) - self.assertEqual(metric.flags, metrictemplate.flags) - self.assertEqual(metric.files, metrictemplate.files) - self.assertEqual(metric.parameter, metrictemplate.parameter) - self.assertEqual(metric.fileparameter, metrictemplate.fileparameter) self.assertEqual(serialized_data['name'], metric.name) - self.assertEqual(serialized_data['mtype'], [metric.mtype.name]) + self.assertEqual(serialized_data['mtype'], [metrictemplate.mtype.name]) self.assertEqual( serialized_data['tags'], [['test_tag1'], ['test_tag3']] ) - self.assertEqual(serialized_data['description'], metric.description) + self.assertEqual( + serialized_data['description'], metrictemplate.description + ) self.assertEqual( serialized_data['probekey'], - [metric.probekey.name, metric.probekey.package.version] + [ + metrictemplate.probekey.name, + metrictemplate.probekey.package.version + ] ) self.assertEqual(serialized_data['group'], ['TEST']) - self.assertEqual(serialized_data['parent'], metric.parent) + self.assertEqual(serialized_data['parent'], metrictemplate.parent) self.assertEqual( - serialized_data['probeexecutable'], metric.probeexecutable + serialized_data['probeexecutable'], metrictemplate.probeexecutable ) self.assertEqual(serialized_data['config'], metric.config) - self.assertEqual(serialized_data['attribute'], metric.attribute) - self.assertEqual(serialized_data['dependancy'], metric.dependancy) - self.assertEqual(serialized_data['flags'], metric.flags) - self.assertEqual(serialized_data['files'], metric.files) - self.assertEqual(serialized_data['parameter'], metric.parameter) - 
self.assertEqual(serialized_data['fileparameter'], metric.fileparameter) + self.assertEqual(serialized_data['attribute'], metrictemplate.attribute) + self.assertEqual( + serialized_data['dependancy'], metrictemplate.dependency + ) + self.assertEqual(serialized_data['flags'], metrictemplate.flags) + self.assertEqual(serialized_data['files'], metrictemplate.files) + self.assertEqual(serialized_data['parameter'], metrictemplate.parameter) + self.assertEqual( + serialized_data['fileparameter'], metrictemplate.fileparameter + ) self.assertFalse(mock_update.called) with schema_context('test2'): metric1 = poem_models.Metric.objects.get(name='argo.AMS-Check') @@ -3106,54 +3083,47 @@ def test_update_metrics_if_different_metrictemplate_version_from_mt_inst( metric_versions1[0].serialized_data )[0]['fields'] self.assertEqual(metric_versions1.count(), 1) - self.assertEqual(metric1.mtype.name, 'Active') - self.assertEqual(metric1.probekey, self.probeversion1_2) self.assertEqual( - metric1.description, - 'Some description of argo.AMS-Check metric template.' + metric1.probeversion, self.probeversion1_2.__str__() ) self.assertEqual(metric1.group.name, 'TEST2') - self.assertEqual(metric1.parent, '') - self.assertEqual(metric1.probeexecutable, '["ams-probe"]') self.assertEqual( metric1.config, '["maxCheckAttempts 4", "timeout 60", ' '"path /usr/libexec/argo-monitoring/probes/argo", ' '"interval 5", "retryInterval 2"]' ) - self.assertEqual(metric1.attribute, '["argo.ams_TOKEN --token"]') - self.assertEqual(metric1.dependancy, '') - self.assertEqual(metric1.flags, '["OBSESS 1"]') - self.assertEqual(metric1.files, '') - self.assertEqual(metric1.parameter, '["--project EGI"]') - self.assertEqual(metric1.fileparameter, '') self.assertEqual(serialized_data1['name'], metric1.name) - self.assertEqual(serialized_data1['mtype'], [metric1.mtype.name]) + self.assertEqual(serialized_data1['mtype'], ["Active"]) self.assertEqual( serialized_data1['tags'], [['test_tag1'], ['test_tag2'], ['test_tag3']] ) self.assertEqual( - serialized_data1['description'], metric1.description + serialized_data1['description'], + "Some description of argo.AMS-Check metric template." 
) self.assertEqual( serialized_data1['probekey'], - [metric1.probekey.name, metric1.probekey.package.version] + [ + self.probeversion1_2.name, + self.probeversion1_2.package.version + ] ) self.assertEqual(serialized_data1['group'], ['TEST2']) - self.assertEqual(serialized_data1['parent'], metric1.parent) + self.assertEqual(serialized_data1['parent'], "") self.assertEqual( - serialized_data1['probeexecutable'], metric1.probeexecutable + serialized_data1['probeexecutable'], '["ams-probe"]' ) self.assertEqual(serialized_data1['config'], metric1.config) - self.assertEqual(serialized_data1['attribute'], metric1.attribute) - self.assertEqual(serialized_data1['dependancy'], metric1.dependancy) - self.assertEqual(serialized_data1['flags'], metric1.flags) - self.assertEqual(serialized_data1['files'], metric1.files) - self.assertEqual(serialized_data1['parameter'], metric1.parameter) self.assertEqual( - serialized_data1['fileparameter'], metric1.fileparameter + serialized_data1['attribute'], '["argo.ams_TOKEN --token"]' ) + self.assertEqual(serialized_data1['dependancy'], "") + self.assertEqual(serialized_data1['flags'], '["OBSESS 1"]') + self.assertEqual(serialized_data1['files'], "") + self.assertEqual(serialized_data1['parameter'], '["--project EGI"]') + self.assertEqual(serialized_data1['fileparameter'], "") self.assertFalse(mock_update.called) @patch('Poem.helpers.metrics_helpers.update_metric_in_schema') diff --git a/poem/Poem/api/tests/test_history.py b/poem/Poem/api/tests/test_history.py index 8f177b13d..fcea3b8bc 100644 --- a/poem/Poem/api/tests/test_history.py +++ b/poem/Poem/api/tests/test_history.py @@ -1,9 +1,11 @@ import datetime import json +import factory from Poem.api import views_internal as views from Poem.poem_super_admin import models as admin_models from Poem.users.models import CustUser +from django.db.models.signals import pre_save from django_tenants.test.cases import TenantTestCase from django_tenants.test.client import TenantRequestFactory from rest_framework import status @@ -11,6 +13,7 @@ class ListVersionsAPIViewTests(TenantTestCase): + @factory.django.mute_signals(pre_save) def setUp(self): self.factory = TenantRequestFactory(self.tenant) self.view = views.ListVersions.as_view() diff --git a/poem/Poem/api/tests/test_metricprofiles.py b/poem/Poem/api/tests/test_metricprofiles.py index 90ffa61c2..3ec41ce30 100644 --- a/poem/Poem/api/tests/test_metricprofiles.py +++ b/poem/Poem/api/tests/test_metricprofiles.py @@ -15,54 +15,6 @@ from .utils_test import mocked_func, encode_data -class ListServiceFlavoursAPIViewTests(TenantTestCase): - def setUp(self): - self.factory = TenantRequestFactory(self.tenant) - self.view = views.ListAllServiceFlavours.as_view() - self.url = '/api/v2/internal/serviceflavoursall/' - self.user = CustUser.objects.create(username='testuser') - - poem_models.ServiceFlavour.objects.create( - name='SRM', - description='Storage Resource Manager.' - ) - - poem_models.ServiceFlavour.objects.create( - name='org.onedata.oneprovider', - description='Oneprovider is a Onedata component...' - ) - - poem_models.ServiceFlavour.objects.create( - name='CREAM-CE', - description='[Site service] The CREAM Compute Element is the new ' - 'CE within the gLite middleware stack.' 
- ) - - def test_get_service_flavours(self): - request = self.factory.get(self.url) - force_authenticate(request, user=self.user) - response = self.view(request) - self.assertEqual( - response.data, - [ - { - 'name': 'CREAM-CE', - 'description': '[Site service] The CREAM Compute Element ' - 'is the new CE within the gLite middleware ' - 'stack.' - }, - { - 'name': 'org.onedata.oneprovider', - 'description': 'Oneprovider is a Onedata component...' - }, - { - 'name': 'SRM', - 'description': 'Storage Resource Manager.' - } - ] - ) - - class ListMetricProfilesAPIViewTests(TenantTestCase): def setUp(self): self.factory = TenantRequestFactory(self.tenant) diff --git a/poem/Poem/api/tests/test_metrics.py b/poem/Poem/api/tests/test_metrics.py index cce04c851..9b9b021c4 100644 --- a/poem/Poem/api/tests/test_metrics.py +++ b/poem/Poem/api/tests/test_metrics.py @@ -2,6 +2,7 @@ import json from unittest.mock import patch, call +import factory import requests from Poem.api import views_internal as views from Poem.api.internal_views.utils import inline_metric_for_db @@ -11,6 +12,8 @@ from Poem.tenants.models import Tenant from Poem.users.models import CustUser from django.contrib.contenttypes.models import ContentType +from django.core import serializers +from django.db.models.signals import pre_save from django_tenants.test.cases import TenantTestCase from django_tenants.test.client import TenantRequestFactory from django_tenants.utils import get_public_schema_name, schema_context, \ @@ -22,6 +25,7 @@ MockResponse +@factory.django.mute_signals(pre_save) def mock_db(): superuser = CustUser.objects.create_user( username='poem', is_superuser=True @@ -29,9 +33,6 @@ def mock_db(): user = CustUser.objects.create_user(username='testuser') limited_user = CustUser.objects.create_user(username='limited') - mtype1 = poem_models.MetricType.objects.create(name='Active') - mtype2 = poem_models.MetricType.objects.create(name='Passive') - mttype1 = admin_models.MetricTemplateType.objects.create(name="Active") mttype2 = admin_models.MetricTemplateType.objects.create(name="Passive") @@ -198,7 +199,7 @@ def mock_db(): fileparameter=mt1.fileparameter, date_created=datetime.datetime.now(), version_user=superuser.username, - version_comment='Initial version.' + version_comment='Updated version.' 
) mt1_version2.tags.add(mtag1, mtag2) @@ -263,15 +264,9 @@ def mock_db(): metric1 = poem_models.Metric.objects.create( name=mt1_version1.name, - mtype=mtype1, - probekey=mt1_version1.probekey, - description=mt1_version1.description, + probeversion=mt1_version1.probekey.__str__(), group=group1, - probeexecutable=mt1_version1.probeexecutable, - config=mt1_version1.config, - attribute=mt1_version1.attribute, - flags=mt1_version1.flags, - parameter=mt1_version1.parameter + config=mt1_version1.config ) ct = ContentType.objects.get_for_model(poem_models.Metric) @@ -288,9 +283,7 @@ def mock_db(): metric2 = poem_models.Metric.objects.create( name=mt2.name, - flags=mt2.flags, - group=group1, - mtype=mtype2 + group=group1 ) poem_models.TenantHistory.objects.create( @@ -305,14 +298,9 @@ def mock_db(): metric3 = poem_models.Metric.objects.create( name=mt3.name, - mtype=mtype1, - probekey=mt3.probekey, - description=mt3.description, + probeversion=mt3.probekey.__str__(), group=group3, - probeexecutable=mt3.probeexecutable, - config=mt3.config, - parameter=mt3.parameter, - flags=mt3.flags + config=mt3.config ) poem_models.TenantHistory.objects.create( @@ -327,6 +315,7 @@ def mock_db(): class ListAllMetricsAPIViewTests(TenantTestCase): + @factory.django.mute_signals(pre_save) def setUp(self): self.factory = TenantRequestFactory(self.tenant) self.view = views.ListAllMetrics.as_view() @@ -336,9 +325,6 @@ def setUp(self): group = poem_models.GroupOfMetrics.objects.create(name='EGI') poem_models.GroupOfMetrics.objects.create(name='delete') - mtype1 = poem_models.MetricType.objects.create(name='Active') - mtype2 = poem_models.MetricType.objects.create(name='Passive') - tag = admin_models.OSTag.objects.create(name='CentOS 6') repo = admin_models.YumRepo.objects.create(name='repo-1', tag=tag) package = admin_models.Package.objects.create( @@ -373,15 +359,13 @@ def setUp(self): poem_models.Metric.objects.create( name='argo.AMS-Check', - mtype=mtype1, - probekey=probeversion1, - group=group, + probeversion=probeversion1.__str__(), + group=group ) poem_models.Metric.objects.create( name='org.apel.APEL-Pub', - group=group, - mtype=mtype2, + group=group ) def test_get_all_metrics(self): @@ -427,6 +411,19 @@ def setUp(self): name="ams-publisher-probe", comment="Initial version." ) + self.mt1_version1 = admin_models.MetricTemplateHistory.objects.get( + name="argo.AMS-Check", version_comment="Initial version." + ) + self.mt1_version2 = admin_models.MetricTemplateHistory.objects.get( + name="argo.AMS-Check", version_comment="Updated version." + ) + self.mt2_version1 = admin_models.MetricTemplateHistory.objects.get( + name="argo.AMSPublisher-Check", version_comment="Initial version." 
+ ) + self.mt3 = admin_models.MetricTemplateHistory.objects.get( + name="org.apel.APEL-Pub" + ) + def test_get_metric_list(self): request = self.factory.get(self.url) force_authenticate(request, user=self.user) @@ -697,49 +694,51 @@ def test_put_metric_superuser(self, func): serialized_data = json.loads( versions[0].serialized_data )[0]['fields'] - self.assertEqual(metric.mtype.name, 'Active') - self.assertEqual(metric.probekey, self.probeversion2) + self.assertEqual(metric.probeversion, self.probeversion2.__str__()) self.assertEqual(metric.group.name, 'EUDAT') - self.assertEqual( - metric.description, 'New description of argo.AMS-Check' - ) - self.assertEqual(metric.parent, '') - self.assertEqual(metric.probeexecutable, '["ams-probe"]') self.assertEqual( metric.config, '["maxCheckAttempts 4", "timeout 70", ' '"path /usr/libexec/argo-monitoring/probes/argo", ' '"interval 6", "retryInterval 4"]') + self.assertEqual(serialized_data['name'], metric.name) self.assertEqual( - metric.attribute, - '["argo.ams_TOKEN2 --token", "mock_attribute attr"]' + serialized_data['mtype'], [self.mt1_version2.mtype.name] ) - self.assertEqual(metric.dependancy, '') - self.assertEqual(metric.flags, '["OBSESS 1"]') - self.assertEqual(metric.files, '') - self.assertEqual(metric.parameter, '') - self.assertEqual(metric.fileparameter, '') - self.assertEqual(serialized_data['name'], metric.name) - self.assertEqual(serialized_data['mtype'], [metric.mtype.name]) self.assertEqual( serialized_data['tags'], [['test_tag1'], ['test_tag2']] ) - self.assertEqual(serialized_data['description'], metric.description) + self.assertEqual( + serialized_data['description'], self.mt1_version2.description + ) self.assertEqual( serialized_data['probekey'], - [metric.probekey.name, metric.probekey.package.version] + [ + self.mt1_version2.probekey.name, + self.mt1_version2.probekey.package.version + ] ) - self.assertEqual(serialized_data['parent'], metric.parent) + self.assertEqual(serialized_data['parent'], self.mt1_version2.parent) self.assertEqual( - serialized_data['probeexecutable'], metric.probeexecutable + serialized_data['probeexecutable'], + self.mt1_version2.probeexecutable ) self.assertEqual(serialized_data['config'], metric.config) - self.assertEqual(serialized_data['attribute'], metric.attribute) - self.assertEqual(serialized_data['dependancy'], metric.dependancy) - self.assertEqual(serialized_data['flags'], metric.flags) - self.assertEqual(serialized_data['files'], metric.files) - self.assertEqual(serialized_data['parameter'], metric.parameter) - self.assertEqual(serialized_data['fileparameter'], metric.fileparameter) + self.assertEqual( + serialized_data['attribute'], self.mt1_version2.attribute + ) + self.assertEqual( + serialized_data['dependancy'], + self.mt1_version2.dependency + ) + self.assertEqual(serialized_data['flags'], self.mt1_version2.flags) + self.assertEqual(serialized_data['files'], self.mt1_version2.files) + self.assertEqual( + serialized_data['parameter'], self.mt1_version2.parameter + ) + self.assertEqual( + serialized_data['fileparameter'], self.mt1_version2.fileparameter + ) @patch('Poem.api.internal_views.metrics.inline_metric_for_db') def test_put_metric_regular_user(self, func): @@ -786,49 +785,50 @@ def test_put_metric_regular_user(self, func): serialized_data = json.loads( versions[0].serialized_data )[0]['fields'] - self.assertEqual(metric.mtype.name, 'Active') - self.assertEqual(metric.probekey, self.probeversion2) + self.assertEqual(metric.probeversion, self.probeversion2.__str__()) 
self.assertEqual(metric.group.name, 'ARGO') - self.assertEqual( - metric.description, 'New description of argo.AMS-Check' - ) - self.assertEqual(metric.parent, '') - self.assertEqual(metric.probeexecutable, '["ams-probe"]') self.assertEqual( metric.config, '["maxCheckAttempts 4", "timeout 70", ' '"path /usr/libexec/argo-monitoring/probes/argo", ' '"interval 6", "retryInterval 4"]') + self.assertEqual(serialized_data['name'], metric.name) self.assertEqual( - metric.attribute, - '["argo.ams_TOKEN2 --token", "mock_attribute attr"]' + serialized_data['mtype'], [self.mt1_version2.mtype.name] ) - self.assertEqual(metric.dependancy, '') - self.assertEqual(metric.flags, '["OBSESS 1"]') - self.assertEqual(metric.files, '') - self.assertEqual(metric.parameter, '') - self.assertEqual(metric.fileparameter, '') - self.assertEqual(serialized_data['name'], metric.name) - self.assertEqual(serialized_data['mtype'], [metric.mtype.name]) self.assertEqual( serialized_data['tags'], [['test_tag1'], ['test_tag2']] ) - self.assertEqual(serialized_data['description'], metric.description) + self.assertEqual( + serialized_data['description'], self.mt1_version2.description + ) self.assertEqual( serialized_data['probekey'], - [metric.probekey.name, metric.probekey.package.version] + [ + self.mt1_version2.probekey.name, + self.mt1_version2.probekey.package.version + ] ) - self.assertEqual(serialized_data['parent'], metric.parent) + self.assertEqual(serialized_data['parent'], self.mt1_version2.parent) self.assertEqual( - serialized_data['probeexecutable'], metric.probeexecutable + serialized_data['probeexecutable'], + self.mt1_version2.probeexecutable ) self.assertEqual(serialized_data['config'], metric.config) - self.assertEqual(serialized_data['attribute'], metric.attribute) - self.assertEqual(serialized_data['dependancy'], metric.dependancy) - self.assertEqual(serialized_data['flags'], metric.flags) - self.assertEqual(serialized_data['files'], metric.files) - self.assertEqual(serialized_data['parameter'], metric.parameter) - self.assertEqual(serialized_data['fileparameter'], metric.fileparameter) + self.assertEqual( + serialized_data['attribute'], self.mt1_version2.attribute + ) + self.assertEqual( + serialized_data['dependancy'], self.mt1_version2.dependency + ) + self.assertEqual(serialized_data['flags'], self.mt1_version2.flags) + self.assertEqual(serialized_data['files'], self.mt1_version2.files) + self.assertEqual( + serialized_data['parameter'], self.mt1_version2.parameter + ) + self.assertEqual( + serialized_data['fileparameter'], self.mt1_version2.fileparameter + ) @patch('Poem.api.internal_views.metrics.inline_metric_for_db') def test_put_metric_regular_user_wrong_group(self, func): @@ -879,49 +879,50 @@ def test_put_metric_regular_user_wrong_group(self, func): serialized_data = json.loads( versions[0].serialized_data )[0]['fields'] - self.assertEqual(metric.mtype.name, 'Active') - self.assertEqual(metric.probekey, self.probeversion1) + self.assertEqual(metric.probeversion, self.probeversion1.__str__()) self.assertEqual(metric.group.name, 'EGI') - self.assertEqual( - metric.description, 'Description of argo.AMS-Check' - ) - self.assertEqual(metric.parent, '') - self.assertEqual(metric.probeexecutable, '["ams-probe"]') self.assertEqual( metric.config, '["maxCheckAttempts 3", "timeout 60", ' '"path /usr/libexec/argo-monitoring/probes/argo", ' '"interval 5", "retryInterval 3"]') + self.assertEqual(serialized_data['name'], metric.name) self.assertEqual( - metric.attribute, - '["argo.ams_TOKEN --token"]' + 
serialized_data['mtype'], [self.mt1_version1.mtype.name] ) - self.assertEqual(metric.dependancy, '') - self.assertEqual(metric.flags, '["OBSESS 1"]') - self.assertEqual(metric.files, '') - self.assertEqual(metric.parameter, '["--project EGI"]') - self.assertEqual(metric.fileparameter, '') - self.assertEqual(serialized_data['name'], metric.name) - self.assertEqual(serialized_data['mtype'], [metric.mtype.name]) self.assertEqual( serialized_data['tags'], [['test_tag1'], ['test_tag2']] ) - self.assertEqual(serialized_data['description'], metric.description) + self.assertEqual( + serialized_data['description'], self.mt1_version1.description + ) self.assertEqual( serialized_data['probekey'], - [metric.probekey.name, metric.probekey.package.version] + [ + self.mt1_version1.probekey.name, + self.mt1_version1.probekey.package.version + ] ) - self.assertEqual(serialized_data['parent'], metric.parent) + self.assertEqual(serialized_data['parent'], self.mt1_version1.parent) self.assertEqual( - serialized_data['probeexecutable'], metric.probeexecutable + serialized_data['probeexecutable'], + self.mt1_version1.probeexecutable ) self.assertEqual(serialized_data['config'], metric.config) - self.assertEqual(serialized_data['attribute'], metric.attribute) - self.assertEqual(serialized_data['dependancy'], metric.dependancy) - self.assertEqual(serialized_data['flags'], metric.flags) - self.assertEqual(serialized_data['files'], metric.files) - self.assertEqual(serialized_data['parameter'], metric.parameter) - self.assertEqual(serialized_data['fileparameter'], metric.fileparameter) + self.assertEqual( + serialized_data['attribute'], self.mt1_version1.attribute + ) + self.assertEqual( + serialized_data['dependancy'], self.mt1_version1.dependency + ) + self.assertEqual(serialized_data['flags'], self.mt1_version1.flags) + self.assertEqual(serialized_data['files'], self.mt1_version1.files) + self.assertEqual( + serialized_data['parameter'], self.mt1_version1.parameter + ) + self.assertEqual( + serialized_data['fileparameter'], self.mt1_version1.fileparameter + ) @patch('Poem.api.internal_views.metrics.inline_metric_for_db') def test_put_metric_regular_user_wrong_init_group(self, func): @@ -968,47 +969,49 @@ def test_put_metric_regular_user_wrong_init_group(self, func): serialized_data = json.loads( versions[0].serialized_data )[0]['fields'] - self.assertEqual(metric.mtype.name, 'Active') - self.assertEqual(metric.probekey, self.probeversion3) + self.assertEqual(metric.probeversion, self.probeversion3.__str__()) self.assertEqual(metric.group.name, 'EUDAT') - self.assertEqual(metric.description, '') - self.assertEqual(metric.parent, '') - self.assertEqual(metric.probeexecutable, '["ams-publisher-probe"]') self.assertEqual( metric.config, '["interval 180", "maxCheckAttempts 1", ' '"path /usr/libexec/argo-monitoring/probes/argo", ' '"retryInterval 1", "timeout 120"]', ) - self.assertEqual(metric.attribute, '') - self.assertEqual(metric.dependancy, '') + self.assertEqual(serialized_data['name'], metric.name) self.assertEqual( - metric.flags, '["NOHOSTNAME 1", "NOTIMEOUT 1", "NOPUBLISH 1"]' + serialized_data['mtype'], [self.mt2_version1.mtype.name] ) - self.assertEqual(metric.files, '') + self.assertEqual(serialized_data['tags'], [['test_tag1']]) self.assertEqual( - metric.parameter, '["-s /var/run/argo-nagios-ams-publisher/sock"]' + serialized_data['description'], self.mt2_version1.description ) - self.assertEqual(metric.fileparameter, '') - self.assertEqual(serialized_data['name'], metric.name) - 
self.assertEqual(serialized_data['mtype'], [metric.mtype.name]) - self.assertEqual(serialized_data['tags'], [['test_tag1']]) - self.assertEqual(serialized_data['description'], metric.description) self.assertEqual( serialized_data['probekey'], - [metric.probekey.name, metric.probekey.package.version] + [ + self.mt2_version1.probekey.name, + self.mt2_version1.probekey.package.version + ] ) - self.assertEqual(serialized_data['parent'], metric.parent) + self.assertEqual(serialized_data['parent'], self.mt2_version1.parent) self.assertEqual( - serialized_data['probeexecutable'], metric.probeexecutable + serialized_data['probeexecutable'], + self.mt2_version1.probeexecutable ) self.assertEqual(serialized_data['config'], metric.config) - self.assertEqual(serialized_data['attribute'], metric.attribute) - self.assertEqual(serialized_data['dependancy'], metric.dependancy) - self.assertEqual(serialized_data['flags'], metric.flags) - self.assertEqual(serialized_data['files'], metric.files) - self.assertEqual(serialized_data['parameter'], metric.parameter) - self.assertEqual(serialized_data['fileparameter'], metric.fileparameter) + self.assertEqual( + serialized_data['attribute'], self.mt2_version1.attribute + ) + self.assertEqual( + serialized_data['dependancy'], self.mt2_version1.dependency + ) + self.assertEqual(serialized_data['flags'], self.mt2_version1.flags) + self.assertEqual(serialized_data['files'], self.mt2_version1.files) + self.assertEqual( + serialized_data['parameter'], self.mt2_version1.parameter + ) + self.assertEqual( + serialized_data['fileparameter'], self.mt2_version1.fileparameter + ) @patch('Poem.api.internal_views.metrics.inline_metric_for_db') def test_put_metric_limited_user(self, func): @@ -1059,49 +1062,50 @@ def test_put_metric_limited_user(self, func): serialized_data = json.loads( versions[0].serialized_data )[0]['fields'] - self.assertEqual(metric.mtype.name, 'Active') - self.assertEqual(metric.probekey, self.probeversion1) + self.assertEqual(metric.probeversion, self.probeversion1.__str__()) self.assertEqual(metric.group.name, 'EGI') - self.assertEqual( - metric.description, 'Description of argo.AMS-Check' - ) - self.assertEqual(metric.parent, '') - self.assertEqual(metric.probeexecutable, '["ams-probe"]') self.assertEqual( metric.config, '["maxCheckAttempts 3", "timeout 60", ' '"path /usr/libexec/argo-monitoring/probes/argo", ' '"interval 5", "retryInterval 3"]') + self.assertEqual(serialized_data['name'], metric.name) self.assertEqual( - metric.attribute, - '["argo.ams_TOKEN --token"]' + serialized_data['mtype'], [self.mt1_version1.mtype.name] ) - self.assertEqual(metric.dependancy, '') - self.assertEqual(metric.flags, '["OBSESS 1"]') - self.assertEqual(metric.files, '') - self.assertEqual(metric.parameter, '["--project EGI"]') - self.assertEqual(metric.fileparameter, '') - self.assertEqual(serialized_data['name'], metric.name) - self.assertEqual(serialized_data['mtype'], [metric.mtype.name]) self.assertEqual( serialized_data['tags'], [['test_tag1'], ['test_tag2']] ) - self.assertEqual(serialized_data['description'], metric.description) + self.assertEqual( + serialized_data['description'], self.mt1_version2.description + ) self.assertEqual( serialized_data['probekey'], - [metric.probekey.name, metric.probekey.package.version] + [ + self.mt1_version1.probekey.name, + self.mt1_version1.probekey.package.version + ] ) - self.assertEqual(serialized_data['parent'], metric.parent) + self.assertEqual(serialized_data['parent'], self.mt1_version1.parent) self.assertEqual( - 
serialized_data['probeexecutable'], metric.probeexecutable + serialized_data['probeexecutable'], + self.mt1_version1.probeexecutable ) self.assertEqual(serialized_data['config'], metric.config) - self.assertEqual(serialized_data['attribute'], metric.attribute) - self.assertEqual(serialized_data['dependancy'], metric.dependancy) - self.assertEqual(serialized_data['flags'], metric.flags) - self.assertEqual(serialized_data['files'], metric.files) - self.assertEqual(serialized_data['parameter'], metric.parameter) - self.assertEqual(serialized_data['fileparameter'], metric.fileparameter) + self.assertEqual( + serialized_data['attribute'], self.mt1_version1.attribute + ) + self.assertEqual( + serialized_data['dependancy'], self.mt1_version1.dependency + ) + self.assertEqual(serialized_data['flags'], self.mt1_version1.flags) + self.assertEqual(serialized_data['files'], self.mt1_version1.files) + self.assertEqual( + serialized_data['parameter'], self.mt1_version1.parameter + ) + self.assertEqual( + serialized_data['fileparameter'], self.mt1_version1.fileparameter + ) @patch('Poem.api.internal_views.metrics.inline_metric_for_db') def test_put_nonexisting_metric_superuser(self, func): @@ -1350,129 +1354,6 @@ def test_put_metric_nonexisting_group_limited_user(self, func): ) self.assertEqual(poem_models.Metric.objects.all().count(), 3) - @patch('Poem.api.internal_views.metrics.inline_metric_for_db') - def test_put_metric_nonexisting_type_superuser(self, func): - func.side_effect = mocked_inline_metric_for_db - conf = [ - {'key': 'maxCheckAttempts', 'value': '4'}, - {'key': 'timeout', 'value': '70'}, - {'key': 'path', - 'value': '/usr/libexec/argo-monitoring/probes/argo'}, - {'key': 'interval', 'value': '6'}, - {'key': 'retryInterval', 'value': '4'} - ] - attribute = [ - {'key': 'argo.ams_TOKEN2', 'value': '--token'}, - {'key': 'mock_attribute', 'value': 'attr'} - ] - data = { - 'name': 'argo.AMS-Check', - 'mtype': 'nonexisting', - 'tags': ['test_tag1', 'test_tag2'], - 'group': 'EGI', - 'description': 'New description of argo.AMS-Check', - 'parent': '', - 'probeversion': 'ams-probe (0.1.11)', - 'probeexecutable': 'ams-probe', - 'config': json.dumps(conf), - 'attribute': json.dumps(attribute), - 'dependancy': '[{"key": "", "value": ""}]', - 'flags': '[{"key": "OBSESS", "value": "1"}]', - 'files': '[{"key": "", "value": ""}]', - 'parameter': '[{"key": "", "value": ""}]', - 'fileparameter': '[{"key": "", "value": ""}]' - } - content, content_type = encode_data(data) - request = self.factory.put(self.url, content, content_type=content_type) - force_authenticate(request, user=self.superuser) - response = self.view(request) - self.assertEqual(response.status_code, status.HTTP_404_NOT_FOUND) - self.assertEqual(response.data['detail'], 'Metric type does not exist.') - self.assertEqual(poem_models.Metric.objects.all().count(), 3) - - @patch('Poem.api.internal_views.metrics.inline_metric_for_db') - def test_put_metric_nonexisting_type_regular_user(self, func): - func.side_effect = mocked_inline_metric_for_db - conf = [ - {'key': 'maxCheckAttempts', 'value': '4'}, - {'key': 'timeout', 'value': '70'}, - {'key': 'path', - 'value': '/usr/libexec/argo-monitoring/probes/argo'}, - {'key': 'interval', 'value': '6'}, - {'key': 'retryInterval', 'value': '4'} - ] - attribute = [ - {'key': 'argo.ams_TOKEN2', 'value': '--token'}, - {'key': 'mock_attribute', 'value': 'attr'} - ] - data = { - 'name': 'argo.AMS-Check', - 'mtype': 'nonexisting', - 'tags': ['test_tag1', 'test_tag2'], - 'group': 'EGI', - 'description': 'New 
description of argo.AMS-Check', - 'parent': '', - 'probeversion': 'ams-probe (0.1.11)', - 'probeexecutable': 'ams-probe', - 'config': json.dumps(conf), - 'attribute': json.dumps(attribute), - 'dependancy': '[{"key": "", "value": ""}]', - 'flags': '[{"key": "OBSESS", "value": "1"}]', - 'files': '[{"key": "", "value": ""}]', - 'parameter': '[{"key": "", "value": ""}]', - 'fileparameter': '[{"key": "", "value": ""}]' - } - content, content_type = encode_data(data) - request = self.factory.put(self.url, content, content_type=content_type) - force_authenticate(request, user=self.user) - response = self.view(request) - self.assertEqual(response.status_code, status.HTTP_404_NOT_FOUND) - self.assertEqual(response.data['detail'], 'Metric type does not exist.') - self.assertEqual(poem_models.Metric.objects.all().count(), 3) - - @patch('Poem.api.internal_views.metrics.inline_metric_for_db') - def test_put_metric_nonexisting_type_limited_user(self, func): - func.side_effect = mocked_inline_metric_for_db - conf = [ - {'key': 'maxCheckAttempts', 'value': '4'}, - {'key': 'timeout', 'value': '70'}, - {'key': 'path', - 'value': '/usr/libexec/argo-monitoring/probes/argo'}, - {'key': 'interval', 'value': '6'}, - {'key': 'retryInterval', 'value': '4'} - ] - attribute = [ - {'key': 'argo.ams_TOKEN2', 'value': '--token'}, - {'key': 'mock_attribute', 'value': 'attr'} - ] - data = { - 'name': 'argo.AMS-Check', - 'mtype': 'nonexisting', - 'tags': ['test_tag1', 'test_tag2'], - 'group': 'EGI', - 'description': 'New description of argo.AMS-Check', - 'parent': '', - 'probeversion': 'ams-probe (0.1.11)', - 'probeexecutable': 'ams-probe', - 'config': json.dumps(conf), - 'attribute': json.dumps(attribute), - 'dependancy': '[{"key": "", "value": ""}]', - 'flags': '[{"key": "OBSESS", "value": "1"}]', - 'files': '[{"key": "", "value": ""}]', - 'parameter': '[{"key": "", "value": ""}]', - 'fileparameter': '[{"key": "", "value": ""}]' - } - content, content_type = encode_data(data) - request = self.factory.put(self.url, content, content_type=content_type) - force_authenticate(request, user=self.limited_user) - response = self.view(request) - self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED) - self.assertEqual( - response.data['detail'], - 'You do not have permission to change metrics.' 
- ) - self.assertEqual(poem_models.Metric.objects.all().count(), 3) - @patch('Poem.api.internal_views.metrics.inline_metric_for_db') def test_put_metric_missing_key_superuser(self, func): func.side_effect = mocked_inline_metric_for_db @@ -1492,7 +1373,6 @@ def test_put_metric_missing_key_superuser(self, func): 'name': 'argo.AMS-Check', 'mtype': 'Active', 'tags': ['test_tag1', 'test_tag2'], - 'group': 'EGI', 'description': 'New description of argo.AMS-Check', 'parent': '', 'probeversion': 'ams-probe (0.1.11)', @@ -1509,7 +1389,7 @@ def test_put_metric_missing_key_superuser(self, func): force_authenticate(request, user=self.superuser) response = self.view(request) self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST) - self.assertEqual(response.data['detail'], 'Missing data key: flags') + self.assertEqual(response.data['detail'], 'Missing data key: group') self.assertEqual(poem_models.Metric.objects.all().count(), 3) @patch('Poem.api.internal_views.metrics.inline_metric_for_db') @@ -1531,7 +1411,6 @@ def test_put_metric_missing_key_regular_user(self, func): 'name': 'argo.AMS-Check', 'mtype': 'Active', 'tags': ['test_tag1', 'test_tag2'], - 'group': 'EGI', 'description': 'New description of argo.AMS-Check', 'parent': '', 'probeversion': 'ams-probe (0.1.11)', @@ -1548,7 +1427,7 @@ def test_put_metric_missing_key_regular_user(self, func): force_authenticate(request, user=self.user) response = self.view(request) self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST) - self.assertEqual(response.data['detail'], 'Missing data key: flags') + self.assertEqual(response.data['detail'], 'Missing data key: group') self.assertEqual(poem_models.Metric.objects.all().count(), 3) @patch('Poem.api.internal_views.metrics.inline_metric_for_db') @@ -1570,7 +1449,6 @@ def test_put_metric_missing_key_limited_user(self, func): 'name': 'argo.AMS-Check', 'mtype': 'Active', 'tags': ['test_tag1', 'test_tag2'], - 'group': 'EGI', 'description': 'New description of argo.AMS-Check', 'parent': '', 'probeversion': 'ams-probe (0.1.11)', @@ -1627,41 +1505,29 @@ def test_put_passive_metric_superuser(self, func): serialized_data = json.loads( versions[0].serialized_data )[0]['fields'] - self.assertEqual(metric.mtype.name, 'Passive') - self.assertEqual(metric.probekey, None) + self.assertEqual(metric.probeversion, None) self.assertEqual(metric.group.name, 'EUDAT') - self.assertEqual( - metric.description, - 'Description of passive metric org.apel.APEL-Pub.' - ) - self.assertEqual(metric.parent, '["test-parent"]') - self.assertEqual(metric.probeexecutable, '') self.assertEqual(metric.config, '') - self.assertEqual(metric.attribute, '') - self.assertEqual(metric.dependancy, '') - self.assertEqual(metric.flags, '["PASSIVE 1"]') - self.assertEqual(metric.files, '') - self.assertEqual(metric.parameter, '') - self.assertEqual(metric.fileparameter, '') self.assertEqual(serialized_data['name'], metric.name) - self.assertEqual(serialized_data['mtype'], [metric.mtype.name]) + self.assertEqual(serialized_data['mtype'], [self.mt3.mtype.name]) self.assertEqual(serialized_data['tags'], []) self.assertEqual(serialized_data['probekey'], None) self.assertEqual( - serialized_data['description'], - 'Description of passive metric org.apel.APEL-Pub.' 
+ serialized_data['description'], self.mt3.description ) - self.assertEqual(serialized_data['parent'], metric.parent) + self.assertEqual(serialized_data['parent'], self.mt3.parent) self.assertEqual( - serialized_data['probeexecutable'], metric.probeexecutable + serialized_data['probeexecutable'], self.mt3.probeexecutable ) self.assertEqual(serialized_data['config'], metric.config) - self.assertEqual(serialized_data['attribute'], metric.attribute) - self.assertEqual(serialized_data['dependancy'], metric.dependancy) - self.assertEqual(serialized_data['flags'], metric.flags) - self.assertEqual(serialized_data['files'], metric.files) - self.assertEqual(serialized_data['parameter'], metric.parameter) - self.assertEqual(serialized_data['fileparameter'], metric.fileparameter) + self.assertEqual(serialized_data['attribute'], self.mt3.attribute) + self.assertEqual(serialized_data['dependancy'], self.mt3.dependency) + self.assertEqual(serialized_data['flags'], self.mt3.flags) + self.assertEqual(serialized_data['files'], self.mt3.files) + self.assertEqual(serialized_data['parameter'], self.mt3.parameter) + self.assertEqual( + serialized_data['fileparameter'], self.mt3.fileparameter + ) @patch('Poem.api.internal_views.metrics.inline_metric_for_db') def test_put_passive_metric_regular_user(self, func): @@ -1696,41 +1562,29 @@ def test_put_passive_metric_regular_user(self, func): serialized_data = json.loads( versions[0].serialized_data )[0]['fields'] - self.assertEqual(metric.mtype.name, 'Passive') - self.assertEqual(metric.probekey, None) + self.assertEqual(metric.probeversion, None) self.assertEqual(metric.group.name, 'ARGO') - self.assertEqual( - metric.description, - 'Description of passive metric org.apel.APEL-Pub.' - ) - self.assertEqual(metric.parent, '["test-parent"]') - self.assertEqual(metric.probeexecutable, '') self.assertEqual(metric.config, '') - self.assertEqual(metric.attribute, '') - self.assertEqual(metric.dependancy, '') - self.assertEqual(metric.flags, '["PASSIVE 1"]') - self.assertEqual(metric.files, '') - self.assertEqual(metric.parameter, '') - self.assertEqual(metric.fileparameter, '') self.assertEqual(serialized_data['name'], metric.name) - self.assertEqual(serialized_data['mtype'], [metric.mtype.name]) + self.assertEqual(serialized_data['mtype'], [self.mt3.mtype.name]) self.assertEqual(serialized_data['tags'], []) self.assertEqual(serialized_data['probekey'], None) self.assertEqual( - serialized_data['description'], - 'Description of passive metric org.apel.APEL-Pub.' 
+ serialized_data['description'], self.mt3.description ) - self.assertEqual(serialized_data['parent'], metric.parent) + self.assertEqual(serialized_data['parent'], self.mt3.parent) self.assertEqual( - serialized_data['probeexecutable'], metric.probeexecutable + serialized_data['probeexecutable'], self.mt3.probeexecutable ) self.assertEqual(serialized_data['config'], metric.config) - self.assertEqual(serialized_data['attribute'], metric.attribute) - self.assertEqual(serialized_data['dependancy'], metric.dependancy) - self.assertEqual(serialized_data['flags'], metric.flags) - self.assertEqual(serialized_data['files'], metric.files) - self.assertEqual(serialized_data['parameter'], metric.parameter) - self.assertEqual(serialized_data['fileparameter'], metric.fileparameter) + self.assertEqual(serialized_data['attribute'], self.mt3.attribute) + self.assertEqual(serialized_data['dependancy'], self.mt3.dependency) + self.assertEqual(serialized_data['flags'], self.mt3.flags) + self.assertEqual(serialized_data['files'], self.mt3.files) + self.assertEqual(serialized_data['parameter'], self.mt3.parameter) + self.assertEqual( + serialized_data['fileparameter'], self.mt3.fileparameter + ) @patch('Poem.api.internal_views.metrics.inline_metric_for_db') def test_put_passive_metric_regular_user_wrong_group(self, func): @@ -1769,35 +1623,27 @@ def test_put_passive_metric_regular_user_wrong_group(self, func): serialized_data = json.loads( versions[0].serialized_data )[0]['fields'] - self.assertEqual(metric.mtype.name, 'Passive') - self.assertEqual(metric.probekey, None) + self.assertEqual(metric.probeversion, None) self.assertEqual(metric.group.name, 'EGI') - self.assertEqual(metric.description, '') - self.assertEqual(metric.parent, '') - self.assertEqual(metric.probeexecutable, '') self.assertEqual(metric.config, '') - self.assertEqual(metric.attribute, '') - self.assertEqual(metric.dependancy, '') - self.assertEqual(metric.flags, '["OBSESS 1", "PASSIVE 1"]') - self.assertEqual(metric.files, '') - self.assertEqual(metric.parameter, '') - self.assertEqual(metric.fileparameter, '') self.assertEqual(serialized_data['name'], metric.name) - self.assertEqual(serialized_data['mtype'], [metric.mtype.name]) + self.assertEqual(serialized_data['mtype'], [self.mt3.mtype.name]) self.assertEqual(serialized_data['tags'], []) self.assertEqual(serialized_data['probekey'], None) self.assertEqual(serialized_data['description'], '') - self.assertEqual(serialized_data['parent'], metric.parent) + self.assertEqual(serialized_data['parent'], self.mt3.parent) self.assertEqual( - serialized_data['probeexecutable'], metric.probeexecutable + serialized_data['probeexecutable'], self.mt3.probeexecutable ) self.assertEqual(serialized_data['config'], metric.config) - self.assertEqual(serialized_data['attribute'], metric.attribute) - self.assertEqual(serialized_data['dependancy'], metric.dependancy) - self.assertEqual(serialized_data['flags'], metric.flags) - self.assertEqual(serialized_data['files'], metric.files) - self.assertEqual(serialized_data['parameter'], metric.parameter) - self.assertEqual(serialized_data['fileparameter'], metric.fileparameter) + self.assertEqual(serialized_data['attribute'], self.mt3.attribute) + self.assertEqual(serialized_data['dependancy'], self.mt3.dependency) + self.assertEqual(serialized_data['flags'], self.mt3.flags) + self.assertEqual(serialized_data['files'], self.mt3.files) + self.assertEqual(serialized_data['parameter'], self.mt3.parameter) + self.assertEqual( + serialized_data['fileparameter'], 
self.mt3.fileparameter + ) @patch('Poem.api.internal_views.metrics.inline_metric_for_db') def test_put_passive_metric_limited_user(self, func): @@ -1836,35 +1682,27 @@ def test_put_passive_metric_limited_user(self, func): serialized_data = json.loads( versions[0].serialized_data )[0]['fields'] - self.assertEqual(metric.mtype.name, 'Passive') - self.assertEqual(metric.probekey, None) + self.assertEqual(metric.probeversion, None) self.assertEqual(metric.group.name, 'EGI') - self.assertEqual(metric.description, '') - self.assertEqual(metric.parent, '') - self.assertEqual(metric.probeexecutable, '') self.assertEqual(metric.config, '') - self.assertEqual(metric.attribute, '') - self.assertEqual(metric.dependancy, '') - self.assertEqual(metric.flags, '["OBSESS 1", "PASSIVE 1"]') - self.assertEqual(metric.files, '') - self.assertEqual(metric.parameter, '') - self.assertEqual(metric.fileparameter, '') self.assertEqual(serialized_data['name'], metric.name) - self.assertEqual(serialized_data['mtype'], [metric.mtype.name]) + self.assertEqual(serialized_data['mtype'], [self.mt3.mtype.name]) self.assertEqual(serialized_data['tags'], []) self.assertEqual(serialized_data['probekey'], None) self.assertEqual(serialized_data['description'], '') - self.assertEqual(serialized_data['parent'], metric.parent) + self.assertEqual(serialized_data['parent'], self.mt3.parent) self.assertEqual( - serialized_data['probeexecutable'], metric.probeexecutable + serialized_data['probeexecutable'], self.mt3.probeexecutable ) self.assertEqual(serialized_data['config'], metric.config) - self.assertEqual(serialized_data['attribute'], metric.attribute) - self.assertEqual(serialized_data['dependancy'], metric.dependancy) - self.assertEqual(serialized_data['flags'], metric.flags) - self.assertEqual(serialized_data['files'], metric.files) - self.assertEqual(serialized_data['parameter'], metric.parameter) - self.assertEqual(serialized_data['fileparameter'], metric.fileparameter) + self.assertEqual(serialized_data['attribute'], self.mt3.attribute) + self.assertEqual(serialized_data['dependancy'], self.mt3.dependency) + self.assertEqual(serialized_data['flags'], self.mt3.flags) + self.assertEqual(serialized_data['files'], self.mt3.files) + self.assertEqual(serialized_data['parameter'], self.mt3.parameter) + self.assertEqual( + serialized_data['fileparameter'], self.mt3.fileparameter + ) def test_delete_metric_superuser(self): self.assertEqual(poem_models.Metric.objects.all().count(), 3) @@ -1947,29 +1785,6 @@ def test_delete_metric_without_specifying_name_limited_user(self): self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST) -class ListMetricTypesAPIViewTests(TenantTestCase): - def setUp(self): - self.factory = TenantRequestFactory(self.tenant) - self.view = views.ListMetricTypes.as_view() - self.url = '/api/v2/internal/mtypes/' - self.user = CustUser.objects.create(username='testuser') - - poem_models.MetricType.objects.create(name='Active') - poem_models.MetricType.objects.create(name='Passive') - - def test_get_tags(self): - request = self.factory.get(self.url) - force_authenticate(request, user=self.user) - response = self.view(request) - self.assertEqual( - [r for r in response.data], - [ - 'Active', - 'Passive', - ] - ) - - class ImportMetricsAPIViewTests(TenantTestCase): def setUp(self): self.factory = TenantRequestFactory(self.tenant) @@ -2243,12 +2058,10 @@ def setUp(self): domain='public', tenant=tenant, is_primary=True ) - self.mtype1 = poem_models.MetricType.objects.create(name='Active') - self.mtype2 = 
poem_models.MetricType.objects.create(name='Passive') - self.mttype1 = admin_models.MetricTemplateType.objects.create( + self.mtype1 = admin_models.MetricTemplateType.objects.create( name='Active' ) - self.mttype2 = admin_models.MetricTemplateType.objects.create( + self.mtype2 = admin_models.MetricTemplateType.objects.create( name='Passive' ) @@ -2397,7 +2210,7 @@ def setUp(self): attribute='["argo.ams_TOKEN --token"]', parameter='["--project EGI"]', flags='["OBSESS 1"]', - mtype=self.mttype1, + mtype=self.mtype1, probekey=self.probehistory1 ) self.mt1.tags.add(self.mtag1, self.mtag2) @@ -2454,7 +2267,7 @@ def setUp(self): '"path /usr/libexec/argo-monitoring/probes/argo", ' '"retryInterval 1", "timeout 130"]', flags='["NOHOSTNAME 1"]', - mtype=self.mttype1, + mtype=self.mtype1, probekey=self.probehistory3 ) mt2.tags.add(self.mtag2) @@ -2505,7 +2318,7 @@ def setUp(self): mt3 = admin_models.MetricTemplate.objects.create( name='org.apel.APEL-Pub', flags='["OBSESS 1", "PASSIVE 1"]', - mtype=self.mttype2 + mtype=self.mtype2 ) admin_models.MetricTemplateHistory.objects.create( @@ -2529,51 +2342,33 @@ def setUp(self): metric1 = poem_models.Metric.objects.create( name='argo.AMS-Check', - description='Description of argo.AMS-Check.', - probeexecutable='["ams-probe"]', config='["interval 180", "maxCheckAttempts 1", ' '"path /usr/libexec/argo-monitoring/probes/argo", ' '"retryInterval 1", "timeout 120"]', - attribute='["argo.ams_TOKEN --token"]', - parameter='["--project EGI"]', - flags='["OBSESS 1"]', - mtype=self.mtype1, - probekey=self.probehistory1, + probeversion=self.probehistory1.__str__(), group=self.group ) metric2 = poem_models.Metric.objects.create( name='argo.AMSPublisher-Check', - description='Description of argo.AMSPublisher-Check.', - probeexecutable='["ams-publisher-probe"]', config='["interval 180", "maxCheckAttempts 3", ' '"path /usr/libexec/argo-monitoring/probes/argo", ' '"retryInterval 1", "timeout 130"]', - flags='["NOHOSTNAME 1"]', - mtype=self.mtype1, - probekey=self.probehistory3, + probeversion=self.probehistory3.__str__(), group=self.group ) metric3 = poem_models.Metric.objects.create( name='emi.unicore.Gateway', - description='', - probeexecutable='["pl.plgrid/UNICORE/umi2/check_gateway/' - 'check_gateway.pl"]', config='["interval 20", "maxCheckAttempts 2", ' '"path /usr/libexec/grid-monitoring/probes", ' '"retryInterval 5", "timeout 60"]', - attribute='["METRIC_CONFIG_FILE -f"]', - flags='["OBSESS 1", "NOHOSTNAME 1", "NOLBNODE 1"]', - mtype=self.mtype1, - probekey=self.probehistory5, + probeversion=self.probehistory5.__str__(), group=self.group ) metric4 = poem_models.Metric.objects.create( name='org.apel.APEL-Pub', - flags='["OBSESS 1", "PASSIVE 1"]', - mtype=self.mtype2, group=self.group ) @@ -2599,9 +2394,30 @@ def setUp(self): user=self.user.username ) + metric3_serialized = serializers.serialize( + "json", [metric3], + use_natural_foreign_keys=True, + use_natural_primary_keys=True + ) + metric3_unserialized = json.loads(metric3_serialized)[0] + metric3_unserialized["fields"].update({ + "tags": [self.mtag1.name], + "mtype": ["Active"], + "description": "", + "parent": "", + "probeexecutable": '["pl.plgrid/UNICORE/umi2/check_gateway/' + 'check_gateway.pl"]', + "attribute": '["METRIC_CONFIG_FILE -f"]', + "dependancy": "", + "flags": '["OBSESS 1", "NOHOSTNAME 1", "NOLBNODE 1"]', + "files": "", + "parameter": "", + "fileparameter": "" + }) + poem_models.TenantHistory.objects.create( object_id=metric3.id, - serialized_data=serialize_metric(metric3, tags=[self.mtag1]), + 
serialized_data=json.dumps([metric3_unserialized]), object_repr=metric3.__str__(), content_type=ct, date_created=datetime.datetime.now(), @@ -2923,16 +2739,10 @@ def test_metrics_with_update_warning_and_deletion( mt1_history3.tags.add(self.mtag1, self.mtag2, self.mtag3) poem_models.Metric.objects.create( name='test.AMS-Check', - description='Description of test.AMS-Check.', - probeexecutable='["ams-probe"]', config='["interval 180", "maxCheckAttempts 1", ' '"path /usr/libexec/argo-monitoring/probes/argo", ' '"retryInterval 1", "timeout 120"]', - attribute='["argo.ams_TOKEN --token"]', - parameter='["--project EGI"]', - flags='["OBSESS 1"]', - mtype=self.mtype1, - probekey=self.probehistory1, + probeversion=self.probehistory1.__str__(), group=self.group ) data = { @@ -3022,16 +2832,10 @@ def test_metrics_with_update_warning_and_deletion_if_api_get_exception( mt1_history3.tags.add(self.mtag1, self.mtag2, self.mtag3) poem_models.Metric.objects.create( name='test.AMS-Check', - description='Description of test.AMS-Check.', - probeexecutable='["ams-probe"]', config='["interval 180", "maxCheckAttempts 1", ' '"path /usr/libexec/argo-monitoring/probes/argo", ' '"retryInterval 1", "timeout 120"]', - attribute='["argo.ams_TOKEN --token"]', - parameter='["--project EGI"]', - flags='["OBSESS 1"]', - mtype=self.mtype1, - probekey=self.probehistory1, + probeversion=self.probehistory1.__str__(), group=self.group ) data = { @@ -3124,16 +2928,10 @@ def test_metrics_with_update_warning_and_deletion_if_api_put_exception( mt1_history3.tags.add(self.mtag1, self.mtag2, self.mtag3) poem_models.Metric.objects.create( name='test.AMS-Check', - description='Description of test.AMS-Check.', - probeexecutable='["ams-probe"]', config='["interval 180", "maxCheckAttempts 1", ' '"path /usr/libexec/argo-monitoring/probes/argo", ' '"retryInterval 1", "timeout 120"]', - attribute='["argo.ams_TOKEN --token"]', - parameter='["--project EGI"]', - flags='["OBSESS 1"]', - mtype=self.mtype1, - probekey=self.probehistory1, + probeversion=self.probehistory1.__str__(), group=self.group ) data = { @@ -3428,16 +3226,10 @@ def test_metrics_with_update_warning_and_deletion_dry( mt1_history3.tags.add(self.mtag1, self.mtag2, self.mtag3) poem_models.Metric.objects.create( name='test.AMS-Check', - description='Description of test.AMS-Check.', - probeexecutable='["ams-probe"]', config='["interval 180", "maxCheckAttempts 1", ' '"path /usr/libexec/argo-monitoring/probes/argo", ' '"retryInterval 1", "timeout 120"]', - attribute='["argo.ams_TOKEN --token"]', - parameter='["--project EGI"]', - flags='["OBSESS 1"]', - mtype=self.mtype1, - probekey=self.probehistory1, + probeversion=self.probehistory1.__str__(), group=self.group ) request = self.factory.get(self.url + 'nagios-plugins-argo-0.1.9') @@ -3565,7 +3357,9 @@ def setUp(self): "eosccore.ui.argo.grnet.gr org.nagios.ARGOWeb-AR " "-r EOSC_Monitoring", "argo.eosc-portal.eu org.nagios.ARGOWeb-Status -u " - "/eosc/report-status/Default/SERVICEGROUPS?accept=csv" + "/eosc/report-status/Default/SERVICEGROUPS?accept=csv", + "sensu.cro-ngi argo.AMSPublisher-Check -q " + "w:metrics+g:published180 -c 10" ] ) ) @@ -3633,6 +3427,12 @@ def test_get_configurations_super_user(self): "parameter": "-u", "value": "/eosc/report-status/Default/SERVICEGROUPS" "?accept=csv" + }, + { + "hostname": "sensu.cro-ngi", + "metric": "argo.AMSPublisher-Check", + "parameter": "-q", + "value": "w:metrics+g:published180 -c 10" } ] } @@ -3689,6 +3489,12 @@ def test_get_configuration_super_user(self): "parameter": "-u", "value": 
"/eosc/report-status/Default/SERVICEGROUPS" "?accept=csv" + }, + { + "hostname": "sensu.cro-ngi", + "metric": "argo.AMSPublisher-Check", + "parameter": "-q", + "value": "w:metrics+g:published180 -c 10" } ] } @@ -3752,6 +3558,12 @@ def test_put_configuration_super_user(self): "metric": "generic.http.ar-argoui", "parameter": "-r", "value": "EOSC" + }, + { + "hostname": "sensu.cro-ngi", + "metric": "argo.AMSPublisher-Check", + "parameter": "-q", + "value": "w:metrics+g:published180 -c 10" } ] } @@ -3777,11 +3589,14 @@ def test_put_configuration_super_user(self): self.assertEqual( conf.metricparameter, json.dumps([ - "eosccore.ui.argo.grnet.gr generic.http.ar-argoui -r EOSC" + "eosccore.ui.argo.grnet.gr generic.http.ar-argoui -r EOSC", + "sensu.cro-ngi argo.AMSPublisher-Check -q " + "w:metrics+g:published180 -c 10" ]) ) def test_put_configuration_regular_user(self): + self.maxDiff = None data = { "id": self.configuration1.id, "name": "local_updated", @@ -3842,7 +3657,9 @@ def test_put_configuration_regular_user(self): "eosccore.ui.argo.grnet.gr org.nagios.ARGOWeb-AR " "-r EOSC_Monitoring", "argo.eosc-portal.eu org.nagios.ARGOWeb-Status -u " - "/eosc/report-status/Default/SERVICEGROUPS?accept=csv" + "/eosc/report-status/Default/SERVICEGROUPS?accept=csv", + "sensu.cro-ngi argo.AMSPublisher-Check -q " + "w:metrics+g:published180 -c 10" ]) ) @@ -3947,7 +3764,9 @@ def test_put_configuration_with_existing_name_regular_user(self): "eosccore.ui.argo.grnet.gr org.nagios.ARGOWeb-AR " "-r EOSC_Monitoring", "argo.eosc-portal.eu org.nagios.ARGOWeb-Status -u " - "/eosc/report-status/Default/SERVICEGROUPS?accept=csv" + "/eosc/report-status/Default/SERVICEGROUPS?accept=csv", + "sensu.cro-ngi argo.AMSPublisher-Check -q " + "w:metrics+g:published180 -c 10" ]) ) diff --git a/poem/Poem/api/tests/test_metrictemplates.py b/poem/Poem/api/tests/test_metrictemplates.py index 277a4f55c..b4fc14819 100644 --- a/poem/Poem/api/tests/test_metrictemplates.py +++ b/poem/Poem/api/tests/test_metrictemplates.py @@ -49,9 +49,6 @@ def mock_db(): mttype1 = admin_models.MetricTemplateType.objects.create(name="Active") mttype2 = admin_models.MetricTemplateType.objects.create(name="Passive") - mtype1 = poem_models.MetricType.objects.create(name="Active") - mtype2 = poem_models.MetricType.objects.create(name="Passive") - ostag1 = admin_models.OSTag.objects.create(name="CentOS 6") ostag2 = admin_models.OSTag.objects.create(name="CentOS 7") @@ -532,21 +529,17 @@ def mock_db(): version_comment='Initial version.', ) + admin_models.DefaultPort.objects.create(name="SITE_BDII_PORT", value="2170") + admin_models.DefaultPort.objects.create(name="BDII_PORT", value="2170") + admin_models.DefaultPort.objects.create(name="GRAM_PORT", value="2119") + admin_models.DefaultPort.objects.create(name="MYPROXY_PORT", value="7512") + group = poem_models.GroupOfMetrics.objects.create(name="TEST") metric1 = poem_models.Metric.objects.create( name=mt4.name, - mtype=mtype1, - probekey=mt4.probekey, - description=mt4.description, - probeexecutable=mt4.probeexecutable, + probeversion=mt4.probekey.__str__(), config=mt4.config, - attribute=mt4.attribute, - dependancy=mt4.dependency, - flags=mt4.flags, - files=mt4.files, - parameter=mt4.parameter, - fileparameter=mt4.fileparameter, group=group ) @@ -562,17 +555,7 @@ def mock_db(): metric2 = poem_models.Metric.objects.create( name=mt2.name, - mtype=mtype2, - probekey=mt2.probekey, - description=mt2.description, - probeexecutable=mt2.probeexecutable, config=mt2.config, - attribute=mt2.attribute, - 
dependancy=mt2.dependency, - flags=mt2.flags, - files=mt2.files, - parameter=mt2.parameter, - fileparameter=mt2.fileparameter, group=group ) @@ -588,17 +571,8 @@ def mock_db(): metric3 = poem_models.Metric.objects.create( name=mt1.name, - mtype=mtype1, - probekey=mt1.probekey, - description=mt1.description, - probeexecutable=mt1.probeexecutable, + probeversion=mt1.probekey.__str__(), config=mt1.config, - attribute=mt1.attribute, - dependancy=mt1.dependency, - flags=mt1.flags, - files=mt1.files, - parameter=mt1.parameter, - fileparameter=mt1.fileparameter, group=group ) @@ -2360,7 +2334,9 @@ def test_post_metric_template_sync_error_sp_superuser( @patch("Poem.api.internal_views.metrictemplates.sync_tags_webapi") @patch('Poem.api.internal_views.metrictemplates.inline_metric_for_db') - def test_post_metric_template_sync_error_sp_user(self, mocked_inline, mock_sync): + def test_post_metric_template_sync_error_sp_user( + self, mocked_inline, mock_sync + ): mocked_inline.side_effect = mocked_inline_metric_for_db mock_sync.side_effect = mocked_syncer_error conf = [ @@ -14440,3 +14416,354 @@ def test_get_metrics4tag(self): response.data, ["argo.AMS-Check", "org.apel.APEL-Pub"] ) + + +class ListDefaultPortsTests(TenantTestCase): + def setUp(self): + self.factory = TenantRequestFactory(self.tenant) + self.view = views.ListDefaultPorts.as_view() + self.url = '/api/v2/internal/default_ports/' + + mock_db() + + self.tenant_superuser = CustUser.objects.get(username="tenant_poem") + self.tenant_user = CustUser.objects.get(username="tenant_user") + + with schema_context(get_public_schema_name()): + self.superuser = CustUser.objects.get(username="poem") + self.user = CustUser.objects.get(username='admin_user') + + self.public_tenant = Tenant.objects.get(name="public") + + self.port1 = admin_models.DefaultPort.objects.get(name="SITE_BDII_PORT") + self.port2 = admin_models.DefaultPort.objects.get(name="BDII_PORT") + self.port3 = admin_models.DefaultPort.objects.get(name="GRAM_PORT") + self.port4 = admin_models.DefaultPort.objects.get(name="MYPROXY_PORT") + + def test_get_ports(self): + request = self.factory.get(self.url) + force_authenticate(request, user=self.superuser) + response = self.view(request) + self.assertEqual( + response.data, [ + { + "id": self.port2.id, + "name": "BDII_PORT", + "value": "2170" + }, + { + "id": self.port3.id, + "name": "GRAM_PORT", + "value": "2119" + }, + { + "id": self.port4.id, + "name": "MYPROXY_PORT", + "value": "7512" + }, + { + "id": self.port1.id, + "name": "SITE_BDII_PORT", + "value": "2170" + } + ] + ) + + def test_post_port_superuser(self): + self.assertEqual(admin_models.DefaultPort.objects.all().count(), 4) + data = { + "ports": json.dumps([ + {"name": "HTCondorCE_PORT", "value": "9619"}, + {"name": "BDII_PORT", "value": "2170"}, + {"name": "GRAM_PORT", "value": "2119"}, + {"name": "MYPROXY_PORT", "value": "7512"}, + {"name": "SITE_BDII_PORT", "value": "2170"} + ]) + } + request = self.factory.post(self.url, data, format="json") + request.tenant = self.public_tenant + force_authenticate(request, user=self.superuser) + response = self.view(request) + self.assertEqual(response.status_code, status.HTTP_201_CREATED) + self.assertEqual(admin_models.DefaultPort.objects.all().count(), 5) + port = admin_models.DefaultPort.objects.get(name="HTCondorCE_PORT") + self.assertEqual(port.value, "9619") + + def test_post_port_regular_user(self): + self.assertEqual(admin_models.DefaultPort.objects.all().count(), 4) + data = { + "ports": json.dumps([ + {"name": "HTCondorCE_PORT", 
"value": "9619"}, + {"name": "BDII_PORT", "value": "2170"}, + {"name": "GRAM_PORT", "value": "2119"}, + {"name": "MYPROXY_PORT", "value": "7512"}, + {"name": "SITE_BDII_PORT", "value": "2170"} + ]) + } + request = self.factory.post(self.url, data, format="json") + request.tenant = self.public_tenant + force_authenticate(request, user=self.user) + response = self.view(request) + self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED) + self.assertEqual( + response.data["detail"], "You do not have permission to add ports" + ) + self.assertEqual(admin_models.DefaultPort.objects.all().count(), 4) + self.assertRaises( + admin_models.DefaultPort.DoesNotExist, + admin_models.DefaultPort.objects.get, + name="HTCondorCE_PORT" + ) + + def test_post_port_tenant_superuser(self): + self.assertEqual(admin_models.DefaultPort.objects.all().count(), 4) + data = { + "ports": json.dumps([ + {"name": "HTCondorCE_PORT", "value": "9619"}, + {"name": "BDII_PORT", "value": "2170"}, + {"name": "GRAM_PORT", "value": "2119"}, + {"name": "MYPROXY_PORT", "value": "7512"}, + {"name": "SITE_BDII_PORT", "value": "2170"} + ]) + } + request = self.factory.post(self.url, data, format="json") + request.tenant = self.tenant + force_authenticate(request, user=self.tenant_superuser) + response = self.view(request) + self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED) + self.assertEqual( + response.data["detail"], "You do not have permission to add ports" + ) + self.assertEqual(admin_models.DefaultPort.objects.all().count(), 4) + self.assertRaises( + admin_models.DefaultPort.DoesNotExist, + admin_models.DefaultPort.objects.get, + name="HTCondorCE_PORT" + ) + + def test_post_port_tenant_regular_user(self): + self.assertEqual(admin_models.DefaultPort.objects.all().count(), 4) + data = { + "ports": json.dumps([ + {"name": "HTCondorCE_PORT", "value": "9619"}, + {"name": "BDII_PORT", "value": "2170"}, + {"name": "GRAM_PORT", "value": "2119"}, + {"name": "MYPROXY_PORT", "value": "7512"}, + {"name": "SITE_BDII_PORT", "value": "2170"} + ]) + } + request = self.factory.post(self.url, data, format="json") + request.tenant = self.tenant + force_authenticate(request, user=self.tenant_user) + response = self.view(request) + self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED) + self.assertEqual( + response.data["detail"], "You do not have permission to add ports" + ) + self.assertEqual(admin_models.DefaultPort.objects.all().count(), 4) + self.assertRaises( + admin_models.DefaultPort.DoesNotExist, + admin_models.DefaultPort.objects.get, + name="HTCondorCE_PORT" + ) + + def test_post_multiple_ports(self): + self.assertEqual(admin_models.DefaultPort.objects.all().count(), 4) + data = { + "ports": json.dumps([ + {"name": "HTCondorCE_PORT", "value": "9619"}, + {"name": "FTS_PORT", "value": "8446"}, + {"name": "BDII_PORT", "value": "2170"}, + {"name": "GRAM_PORT", "value": "2119"}, + {"name": "MYPROXY_PORT", "value": "7512"}, + {"name": "SITE_BDII_PORT", "value": "2170"} + ]) + } + request = self.factory.post(self.url, data, format="json") + request.tenant = self.public_tenant + force_authenticate(request, user=self.superuser) + response = self.view(request) + self.assertEqual(response.status_code, status.HTTP_201_CREATED) + self.assertEqual(admin_models.DefaultPort.objects.all().count(), 6) + port1 = admin_models.DefaultPort.objects.get(name="HTCondorCE_PORT") + port2 = admin_models.DefaultPort.objects.get(name="FTS_PORT") + self.assertEqual(port1.value, "9619") + self.assertEqual(port2.value, "8446") 
+ + def test_post_port_wrong_json_format_superuser(self): + self.assertEqual(admin_models.DefaultPort.objects.all().count(), 4) + data = { + "ports": json.dumps([ + {"name": "HTCondorCE_PORT", "val": "9619"}, + {"name": "BDII_PORT", "value": "2170"}, + {"name": "GRAM_PORT", "value": "2119"}, + {"name": "MYPROXY_PORT", "value": "7512"}, + {"name": "SITE_BDII_PORT", "value": "2170"} + ]) + } + request = self.factory.post(self.url, data, format="json") + request.tenant = self.public_tenant + force_authenticate(request, user=self.superuser) + response = self.view(request) + self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST) + self.assertEqual( + response.data["detail"], "Wrong JSON format: Missing key 'value'" + ) + self.assertEqual(admin_models.DefaultPort.objects.all().count(), 4) + self.assertRaises( + admin_models.DefaultPort.DoesNotExist, + admin_models.DefaultPort.objects.get, + name="HTCondorCE_PORT" + ) + + def test_post_port_wrong_json_format_regular_user(self): + self.assertEqual(admin_models.DefaultPort.objects.all().count(), 4) + data = { + "ports": json.dumps([ + {"name": "HTCondorCE_PORT", "val": "9619"}, + {"name": "BDII_PORT", "value": "2170"}, + {"name": "GRAM_PORT", "value": "2119"}, + {"name": "MYPROXY_PORT", "value": "7512"}, + {"name": "SITE_BDII_PORT", "value": "2170"} + ]) + } + request = self.factory.post(self.url, data, format="json") + request.tenant = self.public_tenant + force_authenticate(request, user=self.user) + response = self.view(request) + self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED) + self.assertEqual( + response.data["detail"], "You do not have permission to add ports" + ) + self.assertEqual(admin_models.DefaultPort.objects.all().count(), 4) + self.assertRaises( + admin_models.DefaultPort.DoesNotExist, + admin_models.DefaultPort.objects.get, + name="HTCondorCE_PORT" + ) + + def test_post_port_wrong_json_format_tenant_superuser(self): + self.assertEqual(admin_models.DefaultPort.objects.all().count(), 4) + data = { + "ports": json.dumps([ + {"name": "HTCondorCE_PORT", "val": "9619"}, + {"name": "BDII_PORT", "value": "2170"}, + {"name": "GRAM_PORT", "value": "2119"}, + {"name": "MYPROXY_PORT", "value": "7512"}, + {"name": "SITE_BDII_PORT", "value": "2170"} + ]) + } + request = self.factory.post(self.url, data, format="json") + request.tenant = self.tenant + force_authenticate(request, user=self.tenant_superuser) + response = self.view(request) + self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED) + self.assertEqual( + response.data["detail"], "You do not have permission to add ports" + ) + self.assertEqual(admin_models.DefaultPort.objects.all().count(), 4) + self.assertRaises( + admin_models.DefaultPort.DoesNotExist, + admin_models.DefaultPort.objects.get, + name="HTCondorCE_PORT" + ) + + def test_post_port_wrong_json_format_tenant_regular_user(self): + self.assertEqual(admin_models.DefaultPort.objects.all().count(), 4) + data = { + "ports": json.dumps([ + {"name": "HTCondorCE_PORT", "val": "9619"}, + {"name": "BDII_PORT", "value": "2170"}, + {"name": "GRAM_PORT", "value": "2119"}, + {"name": "MYPROXY_PORT", "value": "7512"}, + {"name": "SITE_BDII_PORT", "value": "2170"} + ]) + } + request = self.factory.post(self.url, data, format="json") + request.tenant = self.tenant + force_authenticate(request, user=self.tenant_user) + response = self.view(request) + self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED) + self.assertEqual( + response.data["detail"], "You do not have permission to 
add ports" + ) + self.assertEqual(admin_models.DefaultPort.objects.all().count(), 4) + self.assertRaises( + admin_models.DefaultPort.DoesNotExist, + admin_models.DefaultPort.objects.get, + name="HTCondorCE_PORT" + ) + + def test_post_port_wrong_json_format_for_one_port_superuser(self): + self.assertEqual(admin_models.DefaultPort.objects.all().count(), 4) + data = { + "ports": json.dumps([ + {"name": "FTS_PORT", "value": "8446"}, + {"name": "HTCondorCE_PORT", "val": "9619"}, + {"name": "BDII_PORT", "value": "2170"}, + {"name": "GRAM_PORT", "value": "2119"}, + {"name": "MYPROXY_PORT", "value": "7512"}, + {"name": "SITE_BDII_PORT", "value": "2170"} + ]) + } + request = self.factory.post(self.url, data, format="json") + request.tenant = self.public_tenant + force_authenticate(request, user=self.superuser) + response = self.view(request) + self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST) + self.assertEqual( + response.data["detail"], "Wrong JSON format: Missing key 'value'" + ) + self.assertEqual(admin_models.DefaultPort.objects.all().count(), 4) + self.assertRaises( + admin_models.DefaultPort.DoesNotExist, + admin_models.DefaultPort.objects.get, + name="HTCondorCE_PORT" + ) + + def test_post_port_wrong_json_format(self): + self.assertEqual(admin_models.DefaultPort.objects.all().count(), 4) + data = { + "prts": json.dumps([ + {"name": "HTCondorCE_PORT", "value": "9619"}, + {"name": "BDII_PORT", "value": "2170"}, + {"name": "GRAM_PORT", "value": "2119"}, + {"name": "MYPROXY_PORT", "value": "7512"}, + {"name": "SITE_BDII_PORT", "value": "2170"} + ]) + } + request = self.factory.post(self.url, data, format="json") + request.tenant = self.public_tenant + force_authenticate(request, user=self.superuser) + response = self.view(request) + self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST) + self.assertEqual( + response.data["detail"], "Wrong JSON format: Missing key 'ports'" + ) + self.assertEqual(admin_models.DefaultPort.objects.all().count(), 4) + self.assertRaises( + admin_models.DefaultPort.DoesNotExist, + admin_models.DefaultPort.objects.get, + name="HTCondorCE_PORT" + ) + + def test_post_port_with_deletion_superuser(self): + self.assertEqual(admin_models.DefaultPort.objects.all().count(), 4) + data = { + "ports": json.dumps([ + {"name": "BDII_PORT", "value": "2170"}, + {"name": "MYPROXY_PORT", "value": "7512"}, + {"name": "SITE_BDII_PORT", "value": "2170"} + ]) + } + request = self.factory.post(self.url, data, format="json") + request.tenant = self.public_tenant + force_authenticate(request, user=self.superuser) + response = self.view(request) + self.assertEqual(response.status_code, status.HTTP_201_CREATED) + self.assertEqual(admin_models.DefaultPort.objects.all().count(), 3) + self.assertRaises( + admin_models.DefaultPort.DoesNotExist, + admin_models.DefaultPort.objects.get, + name="GRAM_PORT" + ) diff --git a/poem/Poem/api/tests/test_package.py b/poem/Poem/api/tests/test_package.py index 21a34713c..22580cc17 100644 --- a/poem/Poem/api/tests/test_package.py +++ b/poem/Poem/api/tests/test_package.py @@ -99,7 +99,6 @@ def setUp(self): ) mtype = admin_models.MetricTemplateType.objects.create(name='Active') - metrictype = poem_models.MetricType.objects.create(name='Active') group = poem_models.GroupOfMetrics.objects.create(name='TEST') @@ -144,15 +143,10 @@ def setUp(self): metric1 = poem_models.Metric.objects.create( name='argo.AMS-Check', group=group, - mtype=metrictype, - probekey=pv1, - probeexecutable='["ams-probe"]', + probeversion=pv1.__str__(), 
config='["maxCheckAttempts 3", "timeout 60", ' '"path /usr/libexec/argo-monitoring/probes/argo", ' - '"interval 5", "retryInterval 3"]', - attribute='["argo.ams_TOKEN --token"]', - flags='["OBSESS 1"]', - parameter='["--project EGI"]' + '"interval 5", "retryInterval 3"]' ) poem_models.TenantHistory.objects.create( @@ -956,7 +950,7 @@ def test_put_package_sp_superuser(self): self.assertEqual(mt_history.count(), 1) self.assertEqual(mt_history[0].probekey, probe_history[0]) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') - self.assertEqual(metric.probekey, probe_history[0]) + self.assertEqual(metric.probeversion, probe_history[0].__str__()) metric_history = poem_models.TenantHistory.objects.filter( object_repr=metric.__str__() ).order_by('-date_created') @@ -1004,7 +998,7 @@ def test_put_package_sp_user(self): self.assertEqual(mt_history.count(), 1) self.assertEqual(mt_history[0].probekey, probe_history[0]) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') - self.assertEqual(metric.probekey, probe_history[0]) + self.assertEqual(metric.probeversion, probe_history[0].__str__()) metric_history = poem_models.TenantHistory.objects.filter( object_repr=metric.__str__() ).order_by('-date_created') @@ -1052,7 +1046,7 @@ def test_put_package_tenant_superuser(self): self.assertEqual(mt_history.count(), 1) self.assertEqual(mt_history[0].probekey, probe_history[0]) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') - self.assertEqual(metric.probekey, probe_history[0]) + self.assertEqual(metric.probeversion, probe_history[0].__str__()) metric_history = poem_models.TenantHistory.objects.filter( object_repr=metric.__str__() ).order_by('-date_created') @@ -1100,7 +1094,7 @@ def test_put_package_tenant_user(self): self.assertEqual(mt_history.count(), 1) self.assertEqual(mt_history[0].probekey, probe_history[0]) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') - self.assertEqual(metric.probekey, probe_history[0]) + self.assertEqual(metric.probeversion, probe_history[0].__str__()) metric_history = poem_models.TenantHistory.objects.filter( object_repr=metric.__str__() ).order_by('-date_created') @@ -1143,7 +1137,7 @@ def test_put_package_with_present_version_sp_superuser(self): self.assertEqual(mt_history.count(), 1) self.assertEqual(mt_history[0].probekey, probe_history[0]) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') - self.assertEqual(metric.probekey, probe_history[0]) + self.assertEqual(metric.probeversion, probe_history[0].__str__()) metric_history = poem_models.TenantHistory.objects.filter( object_repr=metric.__str__() ).order_by('-date_created') @@ -1191,7 +1185,7 @@ def test_put_package_with_present_version_sp_user(self): self.assertEqual(mt_history.count(), 1) self.assertEqual(mt_history[0].probekey, probe_history[0]) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') - self.assertEqual(metric.probekey, probe_history[0]) + self.assertEqual(metric.probeversion, probe_history[0].__str__()) metric_history = poem_models.TenantHistory.objects.filter( object_repr=metric.__str__() ).order_by('-date_created') @@ -1239,7 +1233,7 @@ def test_put_package_with_present_version_tenant_superuser(self): self.assertEqual(mt_history.count(), 1) self.assertEqual(mt_history[0].probekey, probe_history[0]) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') - self.assertEqual(metric.probekey, probe_history[0]) + self.assertEqual(metric.probeversion, probe_history[0].__str__()) metric_history = 
poem_models.TenantHistory.objects.filter( object_repr=metric.__str__() ).order_by('-date_created') @@ -1287,7 +1281,7 @@ def test_put_package_with_present_version_tenant_user(self): self.assertEqual(mt_history.count(), 1) self.assertEqual(mt_history[0].probekey, probe_history[0]) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') - self.assertEqual(metric.probekey, probe_history[0]) + self.assertEqual(metric.probeversion, probe_history[0].__str__()) metric_history = poem_models.TenantHistory.objects.filter( object_repr=metric.__str__() ).order_by('-date_created') @@ -1332,7 +1326,7 @@ def test_put_package_with_new_repo_sp_superuser(self): self.assertEqual(mt_history.count(), 1) self.assertEqual(mt_history[0].probekey, probe_history[0]) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') - self.assertEqual(metric.probekey, probe_history[0]) + self.assertEqual(metric.probeversion, probe_history[0].__str__()) metric_history = poem_models.TenantHistory.objects.filter( object_repr=metric.__str__() ).order_by('-date_created') @@ -1380,7 +1374,7 @@ def test_put_package_with_new_repo_sp_user(self): self.assertEqual(mt_history.count(), 1) self.assertEqual(mt_history[0].probekey, probe_history[0]) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') - self.assertEqual(metric.probekey, probe_history[0]) + self.assertEqual(metric.probeversion, probe_history[0].__str__()) metric_history = poem_models.TenantHistory.objects.filter( object_repr=metric.__str__() ).order_by('-date_created') @@ -1428,7 +1422,7 @@ def test_put_package_with_new_repo_tenant_superuser(self): self.assertEqual(mt_history.count(), 1) self.assertEqual(mt_history[0].probekey, probe_history[0]) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') - self.assertEqual(metric.probekey, probe_history[0]) + self.assertEqual(metric.probeversion, probe_history[0].__str__()) metric_history = poem_models.TenantHistory.objects.filter( object_repr=metric.__str__() ).order_by('-date_created') @@ -1476,7 +1470,7 @@ def test_put_package_with_new_repo_tenant_user(self): self.assertEqual(mt_history.count(), 1) self.assertEqual(mt_history[0].probekey, probe_history[0]) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') - self.assertEqual(metric.probekey, probe_history[0]) + self.assertEqual(metric.probeversion, probe_history[0].__str__()) metric_history = poem_models.TenantHistory.objects.filter( object_repr=metric.__str__() ).order_by('-date_created') @@ -1614,7 +1608,7 @@ def test_put_package_with_repo_without_tag_sp_superuser(self): self.assertEqual(mt_history.count(), 1) self.assertEqual(mt_history[0].probekey, probe_history[0]) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') - self.assertEqual(metric.probekey, probe_history[0]) + self.assertEqual(metric.probeversion, probe_history[0].__str__()) metric_history = poem_models.TenantHistory.objects.filter( object_repr=metric.__str__() ).order_by('-date_created') @@ -1662,7 +1656,7 @@ def test_put_package_with_repo_without_tag_sp_user(self): self.assertEqual(mt_history.count(), 1) self.assertEqual(mt_history[0].probekey, probe_history[0]) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') - self.assertEqual(metric.probekey, probe_history[0]) + self.assertEqual(metric.probeversion, probe_history[0].__str__()) metric_history = poem_models.TenantHistory.objects.filter( object_repr=metric.__str__() ).order_by('-date_created') @@ -1710,7 +1704,7 @@ def test_put_package_with_repo_without_tag_tenant_superuser(self): 
self.assertEqual(mt_history.count(), 1) self.assertEqual(mt_history[0].probekey, probe_history[0]) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') - self.assertEqual(metric.probekey, probe_history[0]) + self.assertEqual(metric.probeversion, probe_history[0].__str__()) metric_history = poem_models.TenantHistory.objects.filter( object_repr=metric.__str__() ).order_by('-date_created') @@ -1758,7 +1752,7 @@ def test_put_package_with_repo_without_tag_tenant_user(self): self.assertEqual(mt_history.count(), 1) self.assertEqual(mt_history[0].probekey, probe_history[0]) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') - self.assertEqual(metric.probekey, probe_history[0]) + self.assertEqual(metric.probeversion, probe_history[0].__str__()) metric_history = poem_models.TenantHistory.objects.filter( object_repr=metric.__str__() ).order_by('-date_created') @@ -1803,7 +1797,7 @@ def test_put_package_with_nonexisting_repo_sp_superuser(self): self.assertEqual(mt_history.count(), 1) self.assertEqual(mt_history[0].probekey, probe_history[0]) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') - self.assertEqual(metric.probekey, probe_history[0]) + self.assertEqual(metric.probeversion, probe_history[0].__str__()) metric_history = poem_models.TenantHistory.objects.filter( object_repr=metric.__str__() ).order_by('-date_created') @@ -1812,7 +1806,7 @@ def test_put_package_with_nonexisting_repo_sp_superuser(self): json.loads(metric_history[0].serialized_data)[0]['fields'] self.assertEqual(serialized_data['probekey'], ['ams-probe', '0.1.11']) - def test_put_packageh_with_nonexisting_repo_sp_user(self): + def test_put_package_with_nonexisting_repo_sp_user(self): data = { 'id': self.package1.id, 'name': 'nagios-plugins-argo', @@ -1851,7 +1845,7 @@ def test_put_packageh_with_nonexisting_repo_sp_user(self): self.assertEqual(mt_history.count(), 1) self.assertEqual(mt_history[0].probekey, probe_history[0]) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') - self.assertEqual(metric.probekey, probe_history[0]) + self.assertEqual(metric.probeversion, probe_history[0].__str__()) metric_history = poem_models.TenantHistory.objects.filter( object_repr=metric.__str__() ).order_by('-date_created') @@ -1899,7 +1893,7 @@ def test_put_package_with_nonexisting_repo_tenant_superuser(self): self.assertEqual(mt_history.count(), 1) self.assertEqual(mt_history[0].probekey, probe_history[0]) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') - self.assertEqual(metric.probekey, probe_history[0]) + self.assertEqual(metric.probeversion, probe_history[0].__str__()) metric_history = poem_models.TenantHistory.objects.filter( object_repr=metric.__str__() ).order_by('-date_created') @@ -1947,7 +1941,7 @@ def test_put_package_with_nonexisting_repo_tenant_user(self): self.assertEqual(mt_history.count(), 1) self.assertEqual(mt_history[0].probekey, probe_history[0]) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') - self.assertEqual(metric.probekey, probe_history[0]) + self.assertEqual(metric.probeversion, probe_history[0].__str__()) metric_history = poem_models.TenantHistory.objects.filter( object_repr=metric.__str__() ).order_by('-date_created') @@ -1992,7 +1986,7 @@ def test_put_package_with_nonexisting_tag_sp_superuser(self): self.assertEqual(mt_history.count(), 1) self.assertEqual(mt_history[0].probekey, probe_history[0]) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') - self.assertEqual(metric.probekey, probe_history[0]) + 
self.assertEqual(metric.probeversion, probe_history[0].__str__()) metric_history = poem_models.TenantHistory.objects.filter( object_repr=metric.__str__() ).order_by('-date_created') @@ -2040,7 +2034,7 @@ def test_put_package_with_nonexisting_tag_sp_user(self): self.assertEqual(mt_history.count(), 1) self.assertEqual(mt_history[0].probekey, probe_history[0]) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') - self.assertEqual(metric.probekey, probe_history[0]) + self.assertEqual(metric.probeversion, probe_history[0].__str__()) metric_history = poem_models.TenantHistory.objects.filter( object_repr=metric.__str__() ).order_by('-date_created') @@ -2088,7 +2082,7 @@ def test_put_package_with_nonexisting_tag_tenant_superuser(self): self.assertEqual(mt_history.count(), 1) self.assertEqual(mt_history[0].probekey, probe_history[0]) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') - self.assertEqual(metric.probekey, probe_history[0]) + self.assertEqual(metric.probeversion, probe_history[0].__str__()) metric_history = poem_models.TenantHistory.objects.filter( object_repr=metric.__str__() ).order_by('-date_created') @@ -2136,7 +2130,7 @@ def test_put_package_with_nonexisting_tag_tenant_user(self): self.assertEqual(mt_history.count(), 1) self.assertEqual(mt_history[0].probekey, probe_history[0]) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') - self.assertEqual(metric.probekey, probe_history[0]) + self.assertEqual(metric.probeversion, probe_history[0].__str__()) metric_history = poem_models.TenantHistory.objects.filter( object_repr=metric.__str__() ).order_by('-date_created') @@ -2180,7 +2174,7 @@ def test_put_package_with_missing_data_key_sp_superuser(self): self.assertEqual(mt_history.count(), 1) self.assertEqual(mt_history[0].probekey, probe_history[0]) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') - self.assertEqual(metric.probekey, probe_history[0]) + self.assertEqual(metric.probeversion, probe_history[0].__str__()) metric_history = poem_models.TenantHistory.objects.filter( object_repr=metric.__str__() ).order_by('-date_created') @@ -2227,7 +2221,7 @@ def test_put_package_with_missing_data_key_sp_user(self): self.assertEqual(mt_history.count(), 1) self.assertEqual(mt_history[0].probekey, probe_history[0]) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') - self.assertEqual(metric.probekey, probe_history[0]) + self.assertEqual(metric.probeversion, probe_history[0].__str__()) metric_history = poem_models.TenantHistory.objects.filter( object_repr=metric.__str__() ).order_by('-date_created') @@ -2274,7 +2268,7 @@ def test_put_package_with_missing_data_key_tenant_superuser(self): self.assertEqual(mt_history.count(), 1) self.assertEqual(mt_history[0].probekey, probe_history[0]) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') - self.assertEqual(metric.probekey, probe_history[0]) + self.assertEqual(metric.probeversion, probe_history[0].__str__()) metric_history = poem_models.TenantHistory.objects.filter( object_repr=metric.__str__() ).order_by('-date_created') @@ -2321,7 +2315,7 @@ def test_put_package_with_missing_data_key_tenant_user(self): self.assertEqual(mt_history.count(), 1) self.assertEqual(mt_history[0].probekey, probe_history[0]) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') - self.assertEqual(metric.probekey, probe_history[0]) + self.assertEqual(metric.probeversion, probe_history[0].__str__()) metric_history = poem_models.TenantHistory.objects.filter( object_repr=metric.__str__() 
).order_by('-date_created') diff --git a/poem/Poem/api/tests/test_probes.py b/poem/Poem/api/tests/test_probes.py index 96288fe47..ef3f311de 100644 --- a/poem/Poem/api/tests/test_probes.py +++ b/poem/Poem/api/tests/test_probes.py @@ -2,12 +2,12 @@ import json from Poem.api import views_internal as views +from Poem.helpers.history_helpers import serialize_metric from Poem.poem import models as poem_models from Poem.poem_super_admin import models as admin_models from Poem.tenants.models import Tenant from Poem.users.models import CustUser from django.contrib.contenttypes.models import ContentType -from django.core import serializers from django_tenants.test.cases import TenantTestCase from django_tenants.test.client import TenantRequestFactory from django_tenants.utils import schema_context, get_public_schema_name, \ @@ -147,7 +147,6 @@ def setUp(self): ) mtype = admin_models.MetricTemplateType.objects.create(name='Active') - metrictype = poem_models.MetricType.objects.create(name='Active') metrictag1 = admin_models.MetricTags.objects.create(name='test_tag1') metrictag2 = admin_models.MetricTags.objects.create(name='test_tag2') @@ -169,6 +168,26 @@ def setUp(self): ) mt1.tags.add(metrictag1, metrictag2) + mth1 = admin_models.MetricTemplateHistory.objects.create( + object_id=mt1, + name=mt1.name, + mtype=mt1.mtype, + probekey=mt1.probekey, + parent=mt1.parent, + probeexecutable=mt1.probeexecutable, + config=mt1.config, + attribute=mt1.attribute, + dependency=mt1.dependency, + flags=mt1.flags, + files=mt1.files, + parameter=mt1.parameter, + fileparameter=mt1.fileparameter, + date_created=datetime.datetime.now(), + version_comment='Initial version.', + version_user=self.user.username + ) + mth1.tags.add(metrictag1, metrictag2) + mt2 = admin_models.MetricTemplate.objects.create( name='argo.AMS-Check', mtype=mtype, @@ -183,40 +202,47 @@ def setUp(self): ) mt2.tags.add(metrictag1) + mth2 = admin_models.MetricTemplateHistory.objects.create( + object_id=mt2, + name=mt2.name, + mtype=mt2.mtype, + probekey=mt2.probekey, + parent=mt2.parent, + probeexecutable=mt2.probeexecutable, + config=mt2.config, + attribute=mt2.attribute, + dependency=mt2.dependency, + flags=mt2.flags, + files=mt2.files, + parameter=mt2.parameter, + fileparameter=mt2.fileparameter, + date_created=datetime.datetime.now(), + version_comment='Initial version.', + version_user=self.user.username + ) + mth2.tags.add(metrictag1) + metric1 = poem_models.Metric.objects.create( name='argo.API-Check', - mtype=metrictype, group=group, - probekey=pv, - probeexecutable='["web-api"]', + probeversion=pv.__str__(), config='["maxCheckAttempts 3", "timeout 120", ' '"path /usr/libexec/argo-monitoring/probes/argo", ' - '"interval 5", "retryInterval 3"]', - attribute='["argo.api_TOKEN --token"]', - flags='["OBSESS 1"]' + '"interval 5", "retryInterval 3"]' ) metric2 = poem_models.Metric.objects.create( name='argo.AMS-Check', group=group, - mtype=metrictype, - probekey=pv2, - probeexecutable='["ams-probe"]', + probeversion=pv2.__str__(), config='["maxCheckAttempts 3", "timeout 60", ' '"path /usr/libexec/argo-monitoring/probes/argo", ' - '"interval 5", "retryInterval 3"]', - attribute='["argo.ams_TOKEN --token"]', - flags='["OBSESS 1"]', - parameter='["--project EGI"]' + '"interval 5", "retryInterval 3"]' ) poem_models.TenantHistory.objects.create( object_id=metric1.id, - serialized_data=serializers.serialize( - 'json', [metric1], - use_natural_foreign_keys=True, - use_natural_primary_keys=True - ), + serialized_data=serialize_metric(metric1, [metrictag1, 
metrictag2]), object_repr=metric1.__str__(), content_type=ct, date_created=datetime.datetime.now(), @@ -226,11 +252,7 @@ def setUp(self): poem_models.TenantHistory.objects.create( object_id=metric2.id, - serialized_data=serializers.serialize( - 'json', [metric2], - use_natural_foreign_keys=True, - use_natural_primary_keys=True - ), + serialized_data=serialize_metric(metric2, [metrictag1]), object_repr=metric2.__str__(), content_type=ct, date_created=datetime.datetime.now(), @@ -1457,28 +1479,18 @@ def test_put_probe_without_new_version_sp_superuser(self): self.assertEqual(mt.probekey, version) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') self.assertEqual(metric.group.name, 'TEST') - self.assertEqual(metric.parent, '') - self.assertEqual(metric.probeexecutable, '["ams-probe"]') - self.assertEqual(metric.probekey, version) + self.assertEqual(metric.probeversion, version.__str__()) self.assertEqual( metric.config, '["maxCheckAttempts 3", "timeout 60", ' '"path /usr/libexec/argo-monitoring/probes/argo", ' '"interval 5", "retryInterval 3"]' ) - self.assertEqual(metric.attribute, '["argo.ams_TOKEN --token"]') - self.assertEqual(metric.dependancy, '') - self.assertEqual(metric.flags, '["OBSESS 1"]') - self.assertEqual(metric.files, '') - self.assertEqual(metric.parameter, '["--project EGI"]') - self.assertEqual(metric.fileparameter, '') mt_history = poem_models.TenantHistory.objects.filter( object_repr='argo.AMS-Check' ).order_by('-date_created') self.assertEqual(mt_history.count(), 1) - self.assertEqual( - mt_history[0].comment, 'Initial version.' - ) + self.assertEqual(mt_history[0].comment, 'Initial version.') serialized_data = json.loads(mt_history[0].serialized_data)[0]['fields'] self.assertEqual(serialized_data['name'], metric.name) self.assertEqual(serialized_data['mtype'], ['Active']) @@ -1486,17 +1498,17 @@ def test_put_probe_without_new_version_sp_superuser(self): serialized_data['probekey'], ['ams-probe-new', '0.1.11'] ) self.assertEqual(serialized_data['group'], ['TEST']) - self.assertEqual(serialized_data['parent'], metric.parent) + self.assertEqual(serialized_data['parent'], "") + self.assertEqual(serialized_data['probeexecutable'], '["ams-probe"]') + self.assertEqual(serialized_data['config'], metric.config) self.assertEqual( - serialized_data['probeexecutable'], metric.probeexecutable + serialized_data['attribute'], '["argo.ams_TOKEN --token"]' ) - self.assertEqual(serialized_data['config'], metric.config) - self.assertEqual(serialized_data['attribute'], metric.attribute) - self.assertEqual(serialized_data['dependancy'], metric.dependancy) - self.assertEqual(serialized_data['flags'], metric.flags) - self.assertEqual(serialized_data['files'], metric.files) - self.assertEqual(serialized_data['parameter'], metric.parameter) - self.assertEqual(serialized_data['fileparameter'], metric.fileparameter) + self.assertEqual(serialized_data['dependancy'], "") + self.assertEqual(serialized_data['flags'], '["OBSESS 1"]') + self.assertEqual(serialized_data['files'], "") + self.assertEqual(serialized_data['parameter'], '["--project EGI"]') + self.assertEqual(serialized_data['fileparameter'], "") def test_put_probe_without_new_version_sp_user(self): data = { @@ -1556,46 +1568,34 @@ def test_put_probe_without_new_version_sp_user(self): self.assertEqual(mt.probekey, version) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') self.assertEqual(metric.group.name, 'TEST') - self.assertEqual(metric.parent, '') - self.assertEqual(metric.probeexecutable, '["ams-probe"]') - 
self.assertEqual(metric.probekey, version) + self.assertEqual(metric.probeversion, version.__str__()) self.assertEqual( metric.config, '["maxCheckAttempts 3", "timeout 60", ' '"path /usr/libexec/argo-monitoring/probes/argo", ' '"interval 5", "retryInterval 3"]' ) - self.assertEqual(metric.attribute, '["argo.ams_TOKEN --token"]') - self.assertEqual(metric.dependancy, '') - self.assertEqual(metric.flags, '["OBSESS 1"]') - self.assertEqual(metric.files, '') - self.assertEqual(metric.parameter, '["--project EGI"]') - self.assertEqual(metric.fileparameter, '') mt_history = poem_models.TenantHistory.objects.filter( object_repr='argo.AMS-Check' ).order_by('-date_created') self.assertEqual(mt_history.count(), 1) - self.assertEqual( - mt_history[0].comment, 'Initial version.' - ) + self.assertEqual(mt_history[0].comment, 'Initial version.') serialized_data = json.loads(mt_history[0].serialized_data)[0]['fields'] self.assertEqual(serialized_data['name'], metric.name) self.assertEqual(serialized_data['mtype'], ['Active']) - self.assertEqual( - serialized_data['probekey'], ['ams-probe', '0.1.11'] - ) + self.assertEqual(serialized_data['probekey'], ['ams-probe', '0.1.11']) self.assertEqual(serialized_data['group'], ['TEST']) - self.assertEqual(serialized_data['parent'], metric.parent) + self.assertEqual(serialized_data['parent'], "") + self.assertEqual(serialized_data['probeexecutable'], '["ams-probe"]') + self.assertEqual(serialized_data['config'], metric.config) self.assertEqual( - serialized_data['probeexecutable'], metric.probeexecutable + serialized_data['attribute'], '["argo.ams_TOKEN --token"]' ) - self.assertEqual(serialized_data['config'], metric.config) - self.assertEqual(serialized_data['attribute'], metric.attribute) - self.assertEqual(serialized_data['dependancy'], metric.dependancy) - self.assertEqual(serialized_data['flags'], metric.flags) - self.assertEqual(serialized_data['files'], metric.files) - self.assertEqual(serialized_data['parameter'], metric.parameter) - self.assertEqual(serialized_data['fileparameter'], metric.fileparameter) + self.assertEqual(serialized_data['dependancy'], "") + self.assertEqual(serialized_data['flags'], '["OBSESS 1"]') + self.assertEqual(serialized_data['files'], "") + self.assertEqual(serialized_data['parameter'], '["--project EGI"]') + self.assertEqual(serialized_data['fileparameter'], "") def test_put_probe_without_new_version_tenant_superuser(self): data = { @@ -1655,46 +1655,34 @@ def test_put_probe_without_new_version_tenant_superuser(self): self.assertEqual(mt.probekey, version) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') self.assertEqual(metric.group.name, 'TEST') - self.assertEqual(metric.parent, '') - self.assertEqual(metric.probeexecutable, '["ams-probe"]') - self.assertEqual(metric.probekey, version) + self.assertEqual(metric.probeversion, version.__str__()) self.assertEqual( metric.config, '["maxCheckAttempts 3", "timeout 60", ' '"path /usr/libexec/argo-monitoring/probes/argo", ' '"interval 5", "retryInterval 3"]' ) - self.assertEqual(metric.attribute, '["argo.ams_TOKEN --token"]') - self.assertEqual(metric.dependancy, '') - self.assertEqual(metric.flags, '["OBSESS 1"]') - self.assertEqual(metric.files, '') - self.assertEqual(metric.parameter, '["--project EGI"]') - self.assertEqual(metric.fileparameter, '') mt_history = poem_models.TenantHistory.objects.filter( object_repr='argo.AMS-Check' ).order_by('-date_created') self.assertEqual(mt_history.count(), 1) - self.assertEqual( - mt_history[0].comment, 'Initial version.' 
- ) + self.assertEqual(mt_history[0].comment, 'Initial version.') serialized_data = json.loads(mt_history[0].serialized_data)[0]['fields'] self.assertEqual(serialized_data['name'], metric.name) self.assertEqual(serialized_data['mtype'], ['Active']) - self.assertEqual( - serialized_data['probekey'], ['ams-probe', '0.1.11'] - ) + self.assertEqual(serialized_data['probekey'], ['ams-probe', '0.1.11']) self.assertEqual(serialized_data['group'], ['TEST']) - self.assertEqual(serialized_data['parent'], metric.parent) + self.assertEqual(serialized_data['parent'], "") + self.assertEqual(serialized_data['probeexecutable'], '["ams-probe"]') + self.assertEqual(serialized_data['config'], metric.config) self.assertEqual( - serialized_data['probeexecutable'], metric.probeexecutable + serialized_data['attribute'], '["argo.ams_TOKEN --token"]' ) - self.assertEqual(serialized_data['config'], metric.config) - self.assertEqual(serialized_data['attribute'], metric.attribute) - self.assertEqual(serialized_data['dependancy'], metric.dependancy) - self.assertEqual(serialized_data['flags'], metric.flags) - self.assertEqual(serialized_data['files'], metric.files) - self.assertEqual(serialized_data['parameter'], metric.parameter) - self.assertEqual(serialized_data['fileparameter'], metric.fileparameter) + self.assertEqual(serialized_data['dependancy'], "") + self.assertEqual(serialized_data['flags'], '["OBSESS 1"]') + self.assertEqual(serialized_data['files'], "") + self.assertEqual(serialized_data['parameter'], '["--project EGI"]') + self.assertEqual(serialized_data['fileparameter'], "") def test_put_probe_without_new_version_tenant_user(self): data = { @@ -1754,46 +1742,34 @@ def test_put_probe_without_new_version_tenant_user(self): self.assertEqual(mt.probekey, version) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') self.assertEqual(metric.group.name, 'TEST') - self.assertEqual(metric.parent, '') - self.assertEqual(metric.probeexecutable, '["ams-probe"]') - self.assertEqual(metric.probekey, version) + self.assertEqual(metric.probeversion, version.__str__()) self.assertEqual( metric.config, '["maxCheckAttempts 3", "timeout 60", ' '"path /usr/libexec/argo-monitoring/probes/argo", ' '"interval 5", "retryInterval 3"]' ) - self.assertEqual(metric.attribute, '["argo.ams_TOKEN --token"]') - self.assertEqual(metric.dependancy, '') - self.assertEqual(metric.flags, '["OBSESS 1"]') - self.assertEqual(metric.files, '') - self.assertEqual(metric.parameter, '["--project EGI"]') - self.assertEqual(metric.fileparameter, '') mt_history = poem_models.TenantHistory.objects.filter( object_repr='argo.AMS-Check' ).order_by('-date_created') self.assertEqual(mt_history.count(), 1) - self.assertEqual( - mt_history[0].comment, 'Initial version.' 
- ) + self.assertEqual(mt_history[0].comment, 'Initial version.') serialized_data = json.loads(mt_history[0].serialized_data)[0]['fields'] self.assertEqual(serialized_data['name'], metric.name) self.assertEqual(serialized_data['mtype'], ['Active']) - self.assertEqual( - serialized_data['probekey'], ['ams-probe', '0.1.11'] - ) + self.assertEqual(serialized_data['probekey'], ['ams-probe', '0.1.11']) self.assertEqual(serialized_data['group'], ['TEST']) - self.assertEqual(serialized_data['parent'], metric.parent) + self.assertEqual(serialized_data['parent'], "") + self.assertEqual(serialized_data['probeexecutable'], '["ams-probe"]') + self.assertEqual(serialized_data['config'], metric.config) self.assertEqual( - serialized_data['probeexecutable'], metric.probeexecutable + serialized_data['attribute'], '["argo.ams_TOKEN --token"]' ) - self.assertEqual(serialized_data['config'], metric.config) - self.assertEqual(serialized_data['attribute'], metric.attribute) - self.assertEqual(serialized_data['dependancy'], metric.dependancy) - self.assertEqual(serialized_data['flags'], metric.flags) - self.assertEqual(serialized_data['files'], metric.files) - self.assertEqual(serialized_data['parameter'], metric.parameter) - self.assertEqual(serialized_data['fileparameter'], metric.fileparameter) + self.assertEqual(serialized_data['dependancy'], "") + self.assertEqual(serialized_data['flags'], '["OBSESS 1"]') + self.assertEqual(serialized_data['files'], "") + self.assertEqual(serialized_data['parameter'], '["--project EGI"]') + self.assertEqual(serialized_data['fileparameter'], "") def test_put_probe_no_new_name_metric_history_without_new_version_sp_spusr( self @@ -1852,46 +1828,34 @@ def test_put_probe_no_new_name_metric_history_without_new_version_sp_spusr( self.assertEqual(mt.probekey, version) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') self.assertEqual(metric.group.name, 'TEST') - self.assertEqual(metric.parent, '') - self.assertEqual(metric.probeexecutable, '["ams-probe"]') - self.assertEqual(metric.probekey, version) + self.assertEqual(metric.probeversion, version.__str__()) self.assertEqual( metric.config, '["maxCheckAttempts 3", "timeout 60", ' '"path /usr/libexec/argo-monitoring/probes/argo", ' '"interval 5", "retryInterval 3"]' ) - self.assertEqual(metric.attribute, '["argo.ams_TOKEN --token"]') - self.assertEqual(metric.dependancy, '') - self.assertEqual(metric.flags, '["OBSESS 1"]') - self.assertEqual(metric.files, '') - self.assertEqual(metric.parameter, '["--project EGI"]') - self.assertEqual(metric.fileparameter, '') mt_history = poem_models.TenantHistory.objects.filter( object_repr='argo.AMS-Check' ).order_by('-date_created') self.assertEqual(mt_history.count(), 1) - self.assertEqual( - mt_history[0].comment, 'Initial version.' 
- ) + self.assertEqual(mt_history[0].comment, 'Initial version.') serialized_data = json.loads(mt_history[0].serialized_data)[0]['fields'] self.assertEqual(serialized_data['name'], metric.name) self.assertEqual(serialized_data['mtype'], ['Active']) - self.assertEqual( - serialized_data['probekey'], ['ams-probe', '0.1.11'] - ) + self.assertEqual(serialized_data['probekey'], ['ams-probe', '0.1.11']) self.assertEqual(serialized_data['group'], ['TEST']) - self.assertEqual(serialized_data['parent'], metric.parent) + self.assertEqual(serialized_data['parent'], "") + self.assertEqual(serialized_data['probeexecutable'], '["ams-probe"]') + self.assertEqual(serialized_data['config'], metric.config) self.assertEqual( - serialized_data['probeexecutable'], metric.probeexecutable + serialized_data['attribute'], '["argo.ams_TOKEN --token"]' ) - self.assertEqual(serialized_data['config'], metric.config) - self.assertEqual(serialized_data['attribute'], metric.attribute) - self.assertEqual(serialized_data['dependancy'], metric.dependancy) - self.assertEqual(serialized_data['flags'], metric.flags) - self.assertEqual(serialized_data['files'], metric.files) - self.assertEqual(serialized_data['parameter'], metric.parameter) - self.assertEqual(serialized_data['fileparameter'], metric.fileparameter) + self.assertEqual(serialized_data['dependancy'], "") + self.assertEqual(serialized_data['flags'], '["OBSESS 1"]') + self.assertEqual(serialized_data['files'], "") + self.assertEqual(serialized_data['parameter'], '["--project EGI"]') + self.assertEqual(serialized_data['fileparameter'], "") def test_put_probe_no_new_name_metric_history_without_new_version_sp_user( self @@ -1953,46 +1917,34 @@ def test_put_probe_no_new_name_metric_history_without_new_version_sp_user( self.assertEqual(mt.probekey, version) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') self.assertEqual(metric.group.name, 'TEST') - self.assertEqual(metric.parent, '') - self.assertEqual(metric.probeexecutable, '["ams-probe"]') - self.assertEqual(metric.probekey, version) + self.assertEqual(metric.probeversion, version.__str__()) self.assertEqual( metric.config, '["maxCheckAttempts 3", "timeout 60", ' '"path /usr/libexec/argo-monitoring/probes/argo", ' '"interval 5", "retryInterval 3"]' ) - self.assertEqual(metric.attribute, '["argo.ams_TOKEN --token"]') - self.assertEqual(metric.dependancy, '') - self.assertEqual(metric.flags, '["OBSESS 1"]') - self.assertEqual(metric.files, '') - self.assertEqual(metric.parameter, '["--project EGI"]') - self.assertEqual(metric.fileparameter, '') mt_history = poem_models.TenantHistory.objects.filter( object_repr='argo.AMS-Check' ).order_by('-date_created') self.assertEqual(mt_history.count(), 1) - self.assertEqual( - mt_history[0].comment, 'Initial version.' 
- ) + self.assertEqual(mt_history[0].comment, 'Initial version.') serialized_data = json.loads(mt_history[0].serialized_data)[0]['fields'] self.assertEqual(serialized_data['name'], metric.name) self.assertEqual(serialized_data['mtype'], ['Active']) - self.assertEqual( - serialized_data['probekey'], ['ams-probe', '0.1.11'] - ) + self.assertEqual(serialized_data['probekey'], ['ams-probe', '0.1.11']) self.assertEqual(serialized_data['group'], ['TEST']) - self.assertEqual(serialized_data['parent'], metric.parent) + self.assertEqual(serialized_data['parent'], "") + self.assertEqual(serialized_data['probeexecutable'], '["ams-probe"]') + self.assertEqual(serialized_data['config'], metric.config) self.assertEqual( - serialized_data['probeexecutable'], metric.probeexecutable + serialized_data['attribute'], '["argo.ams_TOKEN --token"]' ) - self.assertEqual(serialized_data['config'], metric.config) - self.assertEqual(serialized_data['attribute'], metric.attribute) - self.assertEqual(serialized_data['dependancy'], metric.dependancy) - self.assertEqual(serialized_data['flags'], metric.flags) - self.assertEqual(serialized_data['files'], metric.files) - self.assertEqual(serialized_data['parameter'], metric.parameter) - self.assertEqual(serialized_data['fileparameter'], metric.fileparameter) + self.assertEqual(serialized_data['dependancy'], "") + self.assertEqual(serialized_data['flags'], '["OBSESS 1"]') + self.assertEqual(serialized_data['files'], "") + self.assertEqual(serialized_data['parameter'], '["--project EGI"]') + self.assertEqual(serialized_data['fileparameter'], "") def test_put_probe_no_new_name_metric_history_without_new_version_tn_spusr( self @@ -2054,21 +2006,13 @@ def test_put_probe_no_new_name_metric_history_without_new_version_tn_spusr( self.assertEqual(mt.probekey, version) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') self.assertEqual(metric.group.name, 'TEST') - self.assertEqual(metric.parent, '') - self.assertEqual(metric.probeexecutable, '["ams-probe"]') - self.assertEqual(metric.probekey, version) + self.assertEqual(metric.probeversion, version.__str__()) self.assertEqual( metric.config, '["maxCheckAttempts 3", "timeout 60", ' '"path /usr/libexec/argo-monitoring/probes/argo", ' '"interval 5", "retryInterval 3"]' ) - self.assertEqual(metric.attribute, '["argo.ams_TOKEN --token"]') - self.assertEqual(metric.dependancy, '') - self.assertEqual(metric.flags, '["OBSESS 1"]') - self.assertEqual(metric.files, '') - self.assertEqual(metric.parameter, '["--project EGI"]') - self.assertEqual(metric.fileparameter, '') mt_history = poem_models.TenantHistory.objects.filter( object_repr='argo.AMS-Check' ).order_by('-date_created') @@ -2079,21 +2023,19 @@ def test_put_probe_no_new_name_metric_history_without_new_version_tn_spusr( serialized_data = json.loads(mt_history[0].serialized_data)[0]['fields'] self.assertEqual(serialized_data['name'], metric.name) self.assertEqual(serialized_data['mtype'], ['Active']) - self.assertEqual( - serialized_data['probekey'], ['ams-probe', '0.1.11'] - ) + self.assertEqual(serialized_data['probekey'], ['ams-probe', '0.1.11']) self.assertEqual(serialized_data['group'], ['TEST']) - self.assertEqual(serialized_data['parent'], metric.parent) + self.assertEqual(serialized_data['parent'], "") + self.assertEqual(serialized_data['probeexecutable'], '["ams-probe"]') + self.assertEqual(serialized_data['config'], metric.config) self.assertEqual( - serialized_data['probeexecutable'], metric.probeexecutable + serialized_data['attribute'], '["argo.ams_TOKEN 
--token"]' ) - self.assertEqual(serialized_data['config'], metric.config) - self.assertEqual(serialized_data['attribute'], metric.attribute) - self.assertEqual(serialized_data['dependancy'], metric.dependancy) - self.assertEqual(serialized_data['flags'], metric.flags) - self.assertEqual(serialized_data['files'], metric.files) - self.assertEqual(serialized_data['parameter'], metric.parameter) - self.assertEqual(serialized_data['fileparameter'], metric.fileparameter) + self.assertEqual(serialized_data['dependancy'], "") + self.assertEqual(serialized_data['flags'], '["OBSESS 1"]') + self.assertEqual(serialized_data['files'], "") + self.assertEqual(serialized_data['parameter'], '["--project EGI"]') + self.assertEqual(serialized_data['fileparameter'], "") def test_put_probe_no_new_name_metric_history_without_new_version_tn_user( self @@ -2155,46 +2097,34 @@ def test_put_probe_no_new_name_metric_history_without_new_version_tn_user( self.assertEqual(mt.probekey, version) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') self.assertEqual(metric.group.name, 'TEST') - self.assertEqual(metric.parent, '') - self.assertEqual(metric.probeexecutable, '["ams-probe"]') - self.assertEqual(metric.probekey, version) + self.assertEqual(metric.probeversion, version.__str__()) self.assertEqual( metric.config, '["maxCheckAttempts 3", "timeout 60", ' '"path /usr/libexec/argo-monitoring/probes/argo", ' '"interval 5", "retryInterval 3"]' ) - self.assertEqual(metric.attribute, '["argo.ams_TOKEN --token"]') - self.assertEqual(metric.dependancy, '') - self.assertEqual(metric.flags, '["OBSESS 1"]') - self.assertEqual(metric.files, '') - self.assertEqual(metric.parameter, '["--project EGI"]') - self.assertEqual(metric.fileparameter, '') mt_history = poem_models.TenantHistory.objects.filter( object_repr='argo.AMS-Check' ).order_by('-date_created') self.assertEqual(mt_history.count(), 1) - self.assertEqual( - mt_history[0].comment, 'Initial version.' 
- ) + self.assertEqual(mt_history[0].comment, 'Initial version.') serialized_data = json.loads(mt_history[0].serialized_data)[0]['fields'] self.assertEqual(serialized_data['name'], metric.name) self.assertEqual(serialized_data['mtype'], ['Active']) - self.assertEqual( - serialized_data['probekey'], ['ams-probe', '0.1.11'] - ) + self.assertEqual(serialized_data['probekey'], ['ams-probe', '0.1.11']) self.assertEqual(serialized_data['group'], ['TEST']) - self.assertEqual(serialized_data['parent'], metric.parent) + self.assertEqual(serialized_data['parent'], "") + self.assertEqual(serialized_data['probeexecutable'], '["ams-probe"]') + self.assertEqual(serialized_data['config'], metric.config) self.assertEqual( - serialized_data['probeexecutable'], metric.probeexecutable + serialized_data['attribute'], '["argo.ams_TOKEN --token"]' ) - self.assertEqual(serialized_data['config'], metric.config) - self.assertEqual(serialized_data['attribute'], metric.attribute) - self.assertEqual(serialized_data['dependancy'], metric.dependancy) - self.assertEqual(serialized_data['flags'], metric.flags) - self.assertEqual(serialized_data['files'], metric.files) - self.assertEqual(serialized_data['parameter'], metric.parameter) - self.assertEqual(serialized_data['fileparameter'], metric.fileparameter) + self.assertEqual(serialized_data['dependancy'], "") + self.assertEqual(serialized_data['flags'], '["OBSESS 1"]') + self.assertEqual(serialized_data['files'], "") + self.assertEqual(serialized_data['parameter'], '["--project EGI"]') + self.assertEqual(serialized_data['fileparameter'], "") def test_put_probe_with_new_version_without_metrictemplate_update_sp_spusr( self @@ -2261,13 +2191,11 @@ def test_put_probe_with_new_version_without_metrictemplate_update_sp_spusr( ) metric = poem_models.Metric.objects.get(name='argo.API-Check') self.assertEqual(metric.group.name, 'TEST') - self.assertEqual(metric.parent, '') - self.assertEqual(metric.probeexecutable, '["web-api"]') self.assertEqual( - metric.probekey, + metric.probeversion, admin_models.ProbeHistory.objects.filter( object_id=probe - ).order_by('-date_created')[1] + ).order_by('-date_created')[1].__str__() ) self.assertEqual( metric.config, @@ -2275,37 +2203,27 @@ def test_put_probe_with_new_version_without_metrictemplate_update_sp_spusr( '"path /usr/libexec/argo-monitoring/probes/argo", ' '"interval 5", "retryInterval 3"]' ) - self.assertEqual(metric.attribute, '["argo.api_TOKEN --token"]') - self.assertEqual(metric.dependancy, '') - self.assertEqual(metric.flags, '["OBSESS 1"]') - self.assertEqual(metric.files, '') - self.assertEqual(metric.parameter, '') - self.assertEqual(metric.fileparameter, '') mt_history = poem_models.TenantHistory.objects.filter( object_repr='argo.API-Check' ).order_by('-date_created') self.assertEqual(mt_history.count(), 1) - self.assertEqual( - mt_history[0].comment, 'Initial version.' 
- ) + self.assertEqual(mt_history[0].comment, 'Initial version.') serialized_data = json.loads(mt_history[0].serialized_data)[0]['fields'] self.assertEqual(serialized_data['name'], metric.name) self.assertEqual(serialized_data['mtype'], ['Active']) - self.assertEqual( - serialized_data['probekey'], ['argo-web-api', '0.1.7'] - ) + self.assertEqual(serialized_data['probekey'], ['argo-web-api', '0.1.7']) self.assertEqual(serialized_data['group'], ['TEST']) - self.assertEqual(serialized_data['parent'], metric.parent) + self.assertEqual(serialized_data['parent'], "") + self.assertEqual(serialized_data['probeexecutable'], '["web-api"]') + self.assertEqual(serialized_data['config'], metric.config) self.assertEqual( - serialized_data['probeexecutable'], metric.probeexecutable + serialized_data['attribute'], '["argo.api_TOKEN --token"]' ) - self.assertEqual(serialized_data['config'], metric.config) - self.assertEqual(serialized_data['attribute'], metric.attribute) - self.assertEqual(serialized_data['dependancy'], metric.dependancy) - self.assertEqual(serialized_data['flags'], metric.flags) - self.assertEqual(serialized_data['files'], metric.files) - self.assertEqual(serialized_data['parameter'], metric.parameter) - self.assertEqual(serialized_data['fileparameter'], metric.fileparameter) + self.assertEqual(serialized_data['dependancy'], "") + self.assertEqual(serialized_data['flags'], '["OBSESS 1"]') + self.assertEqual(serialized_data['files'], "") + self.assertEqual(serialized_data['parameter'], "") + self.assertEqual(serialized_data['fileparameter'], "") def test_put_probe_with_new_version_without_metrictemplate_update_sp_user( self @@ -2367,46 +2285,34 @@ def test_put_probe_with_new_version_without_metrictemplate_update_sp_user( self.assertEqual(mt.probekey, version) metric = poem_models.Metric.objects.get(name='argo.API-Check') self.assertEqual(metric.group.name, 'TEST') - self.assertEqual(metric.parent, '') - self.assertEqual(metric.probeexecutable, '["web-api"]') - self.assertEqual(metric.probekey, version) + self.assertEqual(metric.probeversion, version.__str__()) self.assertEqual( metric.config, '["maxCheckAttempts 3", "timeout 120", ' '"path /usr/libexec/argo-monitoring/probes/argo", ' '"interval 5", "retryInterval 3"]' ) - self.assertEqual(metric.attribute, '["argo.api_TOKEN --token"]') - self.assertEqual(metric.dependancy, '') - self.assertEqual(metric.flags, '["OBSESS 1"]') - self.assertEqual(metric.files, '') - self.assertEqual(metric.parameter, '') - self.assertEqual(metric.fileparameter, '') mt_history = poem_models.TenantHistory.objects.filter( object_repr='argo.API-Check' ).order_by('-date_created') self.assertEqual(mt_history.count(), 1) - self.assertEqual( - mt_history[0].comment, 'Initial version.' 
- ) + self.assertEqual(mt_history[0].comment, 'Initial version.') serialized_data = json.loads(mt_history[0].serialized_data)[0]['fields'] self.assertEqual(serialized_data['name'], metric.name) self.assertEqual(serialized_data['mtype'], ['Active']) - self.assertEqual( - serialized_data['probekey'], ['argo-web-api', '0.1.7'] - ) + self.assertEqual(serialized_data['probekey'], ['argo-web-api', '0.1.7']) self.assertEqual(serialized_data['group'], ['TEST']) - self.assertEqual(serialized_data['parent'], metric.parent) + self.assertEqual(serialized_data['parent'], "") + self.assertEqual(serialized_data['probeexecutable'], '["web-api"]') + self.assertEqual(serialized_data['config'], metric.config) self.assertEqual( - serialized_data['probeexecutable'], metric.probeexecutable + serialized_data['attribute'], '["argo.api_TOKEN --token"]' ) - self.assertEqual(serialized_data['config'], metric.config) - self.assertEqual(serialized_data['attribute'], metric.attribute) - self.assertEqual(serialized_data['dependancy'], metric.dependancy) - self.assertEqual(serialized_data['flags'], metric.flags) - self.assertEqual(serialized_data['files'], metric.files) - self.assertEqual(serialized_data['parameter'], metric.parameter) - self.assertEqual(serialized_data['fileparameter'], metric.fileparameter) + self.assertEqual(serialized_data['dependancy'], "") + self.assertEqual(serialized_data['flags'], '["OBSESS 1"]') + self.assertEqual(serialized_data['files'], "") + self.assertEqual(serialized_data['parameter'], "") + self.assertEqual(serialized_data['fileparameter'], "") def test_put_probe_with_new_version_without_metrictemplate_update_tn_sprusr( self @@ -2468,46 +2374,34 @@ def test_put_probe_with_new_version_without_metrictemplate_update_tn_sprusr( self.assertEqual(mt.probekey, version) metric = poem_models.Metric.objects.get(name='argo.API-Check') self.assertEqual(metric.group.name, 'TEST') - self.assertEqual(metric.parent, '') - self.assertEqual(metric.probeexecutable, '["web-api"]') - self.assertEqual(metric.probekey, version) + self.assertEqual(metric.probeversion, version.__str__()) self.assertEqual( metric.config, '["maxCheckAttempts 3", "timeout 120", ' '"path /usr/libexec/argo-monitoring/probes/argo", ' '"interval 5", "retryInterval 3"]' ) - self.assertEqual(metric.attribute, '["argo.api_TOKEN --token"]') - self.assertEqual(metric.dependancy, '') - self.assertEqual(metric.flags, '["OBSESS 1"]') - self.assertEqual(metric.files, '') - self.assertEqual(metric.parameter, '') - self.assertEqual(metric.fileparameter, '') mt_history = poem_models.TenantHistory.objects.filter( object_repr='argo.API-Check' ).order_by('-date_created') self.assertEqual(mt_history.count(), 1) - self.assertEqual( - mt_history[0].comment, 'Initial version.' 
- ) + self.assertEqual(mt_history[0].comment, 'Initial version.') serialized_data = json.loads(mt_history[0].serialized_data)[0]['fields'] self.assertEqual(serialized_data['name'], metric.name) self.assertEqual(serialized_data['mtype'], ['Active']) - self.assertEqual( - serialized_data['probekey'], ['argo-web-api', '0.1.7'] - ) + self.assertEqual(serialized_data['probekey'], ['argo-web-api', '0.1.7']) self.assertEqual(serialized_data['group'], ['TEST']) - self.assertEqual(serialized_data['parent'], metric.parent) + self.assertEqual(serialized_data['parent'], "") + self.assertEqual(serialized_data['probeexecutable'], '["web-api"]') + self.assertEqual(serialized_data['config'], metric.config) self.assertEqual( - serialized_data['probeexecutable'], metric.probeexecutable + serialized_data['attribute'], '["argo.api_TOKEN --token"]' ) - self.assertEqual(serialized_data['config'], metric.config) - self.assertEqual(serialized_data['attribute'], metric.attribute) - self.assertEqual(serialized_data['dependancy'], metric.dependancy) - self.assertEqual(serialized_data['flags'], metric.flags) - self.assertEqual(serialized_data['files'], metric.files) - self.assertEqual(serialized_data['parameter'], metric.parameter) - self.assertEqual(serialized_data['fileparameter'], metric.fileparameter) + self.assertEqual(serialized_data['dependancy'], "") + self.assertEqual(serialized_data['flags'], '["OBSESS 1"]') + self.assertEqual(serialized_data['files'], "") + self.assertEqual(serialized_data['parameter'], "") + self.assertEqual(serialized_data['fileparameter'], "") def test_put_probe_with_new_version_without_metrictemplate_update_tn_user( self @@ -2569,46 +2463,34 @@ def test_put_probe_with_new_version_without_metrictemplate_update_tn_user( self.assertEqual(mt.probekey, version) metric = poem_models.Metric.objects.get(name='argo.API-Check') self.assertEqual(metric.group.name, 'TEST') - self.assertEqual(metric.parent, '') - self.assertEqual(metric.probeexecutable, '["web-api"]') - self.assertEqual(metric.probekey, version) + self.assertEqual(metric.probeversion, version.__str__()) self.assertEqual( metric.config, '["maxCheckAttempts 3", "timeout 120", ' '"path /usr/libexec/argo-monitoring/probes/argo", ' '"interval 5", "retryInterval 3"]' ) - self.assertEqual(metric.attribute, '["argo.api_TOKEN --token"]') - self.assertEqual(metric.dependancy, '') - self.assertEqual(metric.flags, '["OBSESS 1"]') - self.assertEqual(metric.files, '') - self.assertEqual(metric.parameter, '') - self.assertEqual(metric.fileparameter, '') mt_history = poem_models.TenantHistory.objects.filter( object_repr='argo.API-Check' ).order_by('-date_created') self.assertEqual(mt_history.count(), 1) - self.assertEqual( - mt_history[0].comment, 'Initial version.' 
- ) + self.assertEqual(mt_history[0].comment, 'Initial version.') serialized_data = json.loads(mt_history[0].serialized_data)[0]['fields'] self.assertEqual(serialized_data['name'], metric.name) self.assertEqual(serialized_data['mtype'], ['Active']) - self.assertEqual( - serialized_data['probekey'], ['argo-web-api', '0.1.7'] - ) + self.assertEqual(serialized_data['probekey'], ['argo-web-api', '0.1.7']) self.assertEqual(serialized_data['group'], ['TEST']) - self.assertEqual(serialized_data['parent'], metric.parent) + self.assertEqual(serialized_data['parent'], "") + self.assertEqual(serialized_data['probeexecutable'], '["web-api"]') + self.assertEqual(serialized_data['config'], metric.config) self.assertEqual( - serialized_data['probeexecutable'], metric.probeexecutable + serialized_data['attribute'], '["argo.api_TOKEN --token"]' ) - self.assertEqual(serialized_data['config'], metric.config) - self.assertEqual(serialized_data['attribute'], metric.attribute) - self.assertEqual(serialized_data['dependancy'], metric.dependancy) - self.assertEqual(serialized_data['flags'], metric.flags) - self.assertEqual(serialized_data['files'], metric.files) - self.assertEqual(serialized_data['parameter'], metric.parameter) - self.assertEqual(serialized_data['fileparameter'], metric.fileparameter) + self.assertEqual(serialized_data['dependancy'], "") + self.assertEqual(serialized_data['flags'], '["OBSESS 1"]') + self.assertEqual(serialized_data['files'], "") + self.assertEqual(serialized_data['parameter'], "") + self.assertEqual(serialized_data['fileparameter'], "") def test_put_probe_with_new_version_with_metrictemplate_update_sp_spruser( self @@ -2665,46 +2547,34 @@ def test_put_probe_with_new_version_with_metrictemplate_update_sp_spruser( self.assertEqual(mt.probekey, versions[0]) metric = poem_models.Metric.objects.get(name='argo.API-Check') self.assertEqual(metric.group.name, 'TEST') - self.assertEqual(metric.parent, '') - self.assertEqual(metric.probeexecutable, '["web-api"]') - self.assertEqual(metric.probekey, versions[1]) + self.assertEqual(metric.probeversion, versions[1].__str__()) self.assertEqual( metric.config, '["maxCheckAttempts 3", "timeout 120", ' '"path /usr/libexec/argo-monitoring/probes/argo", ' '"interval 5", "retryInterval 3"]' ) - self.assertEqual(metric.attribute, '["argo.api_TOKEN --token"]') - self.assertEqual(metric.dependancy, '') - self.assertEqual(metric.flags, '["OBSESS 1"]') - self.assertEqual(metric.files, '') - self.assertEqual(metric.parameter, '') - self.assertEqual(metric.fileparameter, '') mt_history = poem_models.TenantHistory.objects.filter( object_repr='argo.API-Check' ).order_by('-date_created') self.assertEqual(mt_history.count(), 1) - self.assertEqual( - mt_history[0].comment, 'Initial version.' 
- ) + self.assertEqual(mt_history[0].comment, 'Initial version.') serialized_data = json.loads(mt_history[0].serialized_data)[0]['fields'] self.assertEqual(serialized_data['name'], metric.name) self.assertEqual(serialized_data['mtype'], ['Active']) - self.assertEqual( - serialized_data['probekey'], ['argo-web-api', '0.1.7'] - ) + self.assertEqual(serialized_data['probekey'], ['argo-web-api', '0.1.7']) self.assertEqual(serialized_data['group'], ['TEST']) - self.assertEqual(serialized_data['parent'], metric.parent) + self.assertEqual(serialized_data['parent'], "") + self.assertEqual(serialized_data['probeexecutable'], '["web-api"]') + self.assertEqual(serialized_data['config'], metric.config) self.assertEqual( - serialized_data['probeexecutable'], metric.probeexecutable + serialized_data['attribute'], '["argo.api_TOKEN --token"]' ) - self.assertEqual(serialized_data['config'], metric.config) - self.assertEqual(serialized_data['attribute'], metric.attribute) - self.assertEqual(serialized_data['dependancy'], metric.dependancy) - self.assertEqual(serialized_data['flags'], metric.flags) - self.assertEqual(serialized_data['files'], metric.files) - self.assertEqual(serialized_data['parameter'], metric.parameter) - self.assertEqual(serialized_data['fileparameter'], metric.fileparameter) + self.assertEqual(serialized_data['dependancy'], "") + self.assertEqual(serialized_data['flags'], '["OBSESS 1"]') + self.assertEqual(serialized_data['files'], "") + self.assertEqual(serialized_data['parameter'], "") + self.assertEqual(serialized_data['fileparameter'], "") def test_put_probe_with_new_version_with_metrictemplate_update_sp_user( self @@ -2766,46 +2636,34 @@ def test_put_probe_with_new_version_with_metrictemplate_update_sp_user( self.assertEqual(mt.probekey, versions[0]) metric = poem_models.Metric.objects.get(name='argo.API-Check') self.assertEqual(metric.group.name, 'TEST') - self.assertEqual(metric.parent, '') - self.assertEqual(metric.probeexecutable, '["web-api"]') - self.assertEqual(metric.probekey, versions[0]) + self.assertEqual(metric.probeversion, versions[0].__str__()) self.assertEqual( metric.config, '["maxCheckAttempts 3", "timeout 120", ' '"path /usr/libexec/argo-monitoring/probes/argo", ' '"interval 5", "retryInterval 3"]' ) - self.assertEqual(metric.attribute, '["argo.api_TOKEN --token"]') - self.assertEqual(metric.dependancy, '') - self.assertEqual(metric.flags, '["OBSESS 1"]') - self.assertEqual(metric.files, '') - self.assertEqual(metric.parameter, '') - self.assertEqual(metric.fileparameter, '') mt_history = poem_models.TenantHistory.objects.filter( object_repr='argo.API-Check' ).order_by('-date_created') self.assertEqual(mt_history.count(), 1) - self.assertEqual( - mt_history[0].comment, 'Initial version.' 
- ) + self.assertEqual(mt_history[0].comment, 'Initial version.') serialized_data = json.loads(mt_history[0].serialized_data)[0]['fields'] self.assertEqual(serialized_data['name'], metric.name) self.assertEqual(serialized_data['mtype'], ['Active']) - self.assertEqual( - serialized_data['probekey'], ['argo-web-api', '0.1.7'] - ) + self.assertEqual(serialized_data['probekey'], ['argo-web-api', '0.1.7']) self.assertEqual(serialized_data['group'], ['TEST']) - self.assertEqual(serialized_data['parent'], metric.parent) + self.assertEqual(serialized_data['parent'], "") + self.assertEqual(serialized_data['probeexecutable'], '["web-api"]') + self.assertEqual(serialized_data['config'], metric.config) self.assertEqual( - serialized_data['probeexecutable'], metric.probeexecutable + serialized_data['attribute'], '["argo.api_TOKEN --token"]' ) - self.assertEqual(serialized_data['config'], metric.config) - self.assertEqual(serialized_data['attribute'], metric.attribute) - self.assertEqual(serialized_data['dependancy'], metric.dependancy) - self.assertEqual(serialized_data['flags'], metric.flags) - self.assertEqual(serialized_data['files'], metric.files) - self.assertEqual(serialized_data['parameter'], metric.parameter) - self.assertEqual(serialized_data['fileparameter'], metric.fileparameter) + self.assertEqual(serialized_data['dependancy'], "") + self.assertEqual(serialized_data['flags'], '["OBSESS 1"]') + self.assertEqual(serialized_data['files'], "") + self.assertEqual(serialized_data['parameter'], "") + self.assertEqual(serialized_data['fileparameter'], "") def test_put_probe_with_new_version_with_metrictemplate_update_tennt_sprusr( self @@ -2867,46 +2725,34 @@ def test_put_probe_with_new_version_with_metrictemplate_update_tennt_sprusr( self.assertEqual(mt.probekey, versions[0]) metric = poem_models.Metric.objects.get(name='argo.API-Check') self.assertEqual(metric.group.name, 'TEST') - self.assertEqual(metric.parent, '') - self.assertEqual(metric.probeexecutable, '["web-api"]') - self.assertEqual(metric.probekey, versions[0]) + self.assertEqual(metric.probeversion, versions[0].__str__()) self.assertEqual( metric.config, '["maxCheckAttempts 3", "timeout 120", ' '"path /usr/libexec/argo-monitoring/probes/argo", ' '"interval 5", "retryInterval 3"]' ) - self.assertEqual(metric.attribute, '["argo.api_TOKEN --token"]') - self.assertEqual(metric.dependancy, '') - self.assertEqual(metric.flags, '["OBSESS 1"]') - self.assertEqual(metric.files, '') - self.assertEqual(metric.parameter, '') - self.assertEqual(metric.fileparameter, '') mt_history = poem_models.TenantHistory.objects.filter( object_repr='argo.API-Check' ).order_by('-date_created') self.assertEqual(mt_history.count(), 1) - self.assertEqual( - mt_history[0].comment, 'Initial version.' 
- ) + self.assertEqual(mt_history[0].comment, 'Initial version.') serialized_data = json.loads(mt_history[0].serialized_data)[0]['fields'] self.assertEqual(serialized_data['name'], metric.name) self.assertEqual(serialized_data['mtype'], ['Active']) - self.assertEqual( - serialized_data['probekey'], ['argo-web-api', '0.1.7'] - ) + self.assertEqual(serialized_data['probekey'], ['argo-web-api', '0.1.7']) self.assertEqual(serialized_data['group'], ['TEST']) - self.assertEqual(serialized_data['parent'], metric.parent) + self.assertEqual(serialized_data['parent'], "") + self.assertEqual(serialized_data['probeexecutable'], '["web-api"]') + self.assertEqual(serialized_data['config'], metric.config) self.assertEqual( - serialized_data['probeexecutable'], metric.probeexecutable + serialized_data['attribute'], '["argo.api_TOKEN --token"]' ) - self.assertEqual(serialized_data['config'], metric.config) - self.assertEqual(serialized_data['attribute'], metric.attribute) - self.assertEqual(serialized_data['dependancy'], metric.dependancy) - self.assertEqual(serialized_data['flags'], metric.flags) - self.assertEqual(serialized_data['files'], metric.files) - self.assertEqual(serialized_data['parameter'], metric.parameter) - self.assertEqual(serialized_data['fileparameter'], metric.fileparameter) + self.assertEqual(serialized_data['dependancy'], "") + self.assertEqual(serialized_data['flags'], '["OBSESS 1"]') + self.assertEqual(serialized_data['files'], "") + self.assertEqual(serialized_data['parameter'], "") + self.assertEqual(serialized_data['fileparameter'], "") def test_put_probe_with_new_version_with_metrictemplate_update_tenant_user( self @@ -2968,46 +2814,34 @@ def test_put_probe_with_new_version_with_metrictemplate_update_tenant_user( self.assertEqual(mt.probekey, versions[0]) metric = poem_models.Metric.objects.get(name='argo.API-Check') self.assertEqual(metric.group.name, 'TEST') - self.assertEqual(metric.parent, '') - self.assertEqual(metric.probeexecutable, '["web-api"]') - self.assertEqual(metric.probekey, versions[0]) + self.assertEqual(metric.probeversion, versions[0].__str__()) self.assertEqual( metric.config, '["maxCheckAttempts 3", "timeout 120", ' '"path /usr/libexec/argo-monitoring/probes/argo", ' '"interval 5", "retryInterval 3"]' ) - self.assertEqual(metric.attribute, '["argo.api_TOKEN --token"]') - self.assertEqual(metric.dependancy, '') - self.assertEqual(metric.flags, '["OBSESS 1"]') - self.assertEqual(metric.files, '') - self.assertEqual(metric.parameter, '') - self.assertEqual(metric.fileparameter, '') mt_history = poem_models.TenantHistory.objects.filter( object_repr='argo.API-Check' ).order_by('-date_created') self.assertEqual(mt_history.count(), 1) - self.assertEqual( - mt_history[0].comment, 'Initial version.' 
- ) + self.assertEqual(mt_history[0].comment, 'Initial version.') serialized_data = json.loads(mt_history[0].serialized_data)[0]['fields'] self.assertEqual(serialized_data['name'], metric.name) self.assertEqual(serialized_data['mtype'], ['Active']) - self.assertEqual( - serialized_data['probekey'], ['argo-web-api', '0.1.7'] - ) + self.assertEqual(serialized_data['probekey'], ['argo-web-api', '0.1.7']) self.assertEqual(serialized_data['group'], ['TEST']) - self.assertEqual(serialized_data['parent'], metric.parent) + self.assertEqual(serialized_data['parent'], "") + self.assertEqual(serialized_data['probeexecutable'], '["web-api"]') + self.assertEqual(serialized_data['config'], metric.config) self.assertEqual( - serialized_data['probeexecutable'], metric.probeexecutable + serialized_data['attribute'], '["argo.api_TOKEN --token"]' ) - self.assertEqual(serialized_data['config'], metric.config) - self.assertEqual(serialized_data['attribute'], metric.attribute) - self.assertEqual(serialized_data['dependancy'], metric.dependancy) - self.assertEqual(serialized_data['flags'], metric.flags) - self.assertEqual(serialized_data['files'], metric.files) - self.assertEqual(serialized_data['parameter'], metric.parameter) - self.assertEqual(serialized_data['fileparameter'], metric.fileparameter) + self.assertEqual(serialized_data['dependancy'], "") + self.assertEqual(serialized_data['flags'], '["OBSESS 1"]') + self.assertEqual(serialized_data['files'], "") + self.assertEqual(serialized_data['parameter'], "") + self.assertEqual(serialized_data['fileparameter'], "") def test_put_probe_with_nonexisting_probe_sp_superuser(self): data = { @@ -3161,21 +2995,13 @@ def test_put_probe_missing_data_key_sp_superuser(self): self.assertEqual(mt.probekey, version) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') self.assertEqual(metric.group.name, 'TEST') - self.assertEqual(metric.parent, '') - self.assertEqual(metric.probeexecutable, '["ams-probe"]') - self.assertEqual(metric.probekey, version) + self.assertEqual(metric.probeversion, version.__str__()) self.assertEqual( metric.config, '["maxCheckAttempts 3", "timeout 60", ' '"path /usr/libexec/argo-monitoring/probes/argo", ' '"interval 5", "retryInterval 3"]' ) - self.assertEqual(metric.attribute, '["argo.ams_TOKEN --token"]') - self.assertEqual(metric.dependancy, '') - self.assertEqual(metric.flags, '["OBSESS 1"]') - self.assertEqual(metric.files, '') - self.assertEqual(metric.parameter, '["--project EGI"]') - self.assertEqual(metric.fileparameter, '') mt_history = poem_models.TenantHistory.objects.filter( object_repr='argo.AMS-Check' ).order_by('-date_created') @@ -3186,21 +3012,19 @@ def test_put_probe_missing_data_key_sp_superuser(self): serialized_data = json.loads(mt_history[0].serialized_data)[0]['fields'] self.assertEqual(serialized_data['name'], metric.name) self.assertEqual(serialized_data['mtype'], ['Active']) - self.assertEqual( - serialized_data['probekey'], ['ams-probe', '0.1.11'] - ) + self.assertEqual(serialized_data['probekey'], ['ams-probe', '0.1.11']) self.assertEqual(serialized_data['group'], ['TEST']) - self.assertEqual(serialized_data['parent'], metric.parent) + self.assertEqual(serialized_data['parent'], "") + self.assertEqual(serialized_data['probeexecutable'], '["ams-probe"]') + self.assertEqual(serialized_data['config'], metric.config) self.assertEqual( - serialized_data['probeexecutable'], metric.probeexecutable + serialized_data['attribute'], '["argo.ams_TOKEN --token"]' ) - self.assertEqual(serialized_data['config'], 
metric.config) - self.assertEqual(serialized_data['attribute'], metric.attribute) - self.assertEqual(serialized_data['dependancy'], metric.dependancy) - self.assertEqual(serialized_data['flags'], metric.flags) - self.assertEqual(serialized_data['files'], metric.files) - self.assertEqual(serialized_data['parameter'], metric.parameter) - self.assertEqual(serialized_data['fileparameter'], metric.fileparameter) + self.assertEqual(serialized_data['dependancy'], "") + self.assertEqual(serialized_data['flags'], '["OBSESS 1"]') + self.assertEqual(serialized_data['files'], "") + self.assertEqual(serialized_data['parameter'], '["--project EGI"]') + self.assertEqual(serialized_data['fileparameter'], "") def test_put_probe_missing_data_key_sp_user(self): data = { @@ -3258,21 +3082,13 @@ def test_put_probe_missing_data_key_sp_user(self): self.assertEqual(mt.probekey, version) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') self.assertEqual(metric.group.name, 'TEST') - self.assertEqual(metric.parent, '') - self.assertEqual(metric.probeexecutable, '["ams-probe"]') - self.assertEqual(metric.probekey, version) + self.assertEqual(metric.probeversion, version.__str__()) self.assertEqual( metric.config, '["maxCheckAttempts 3", "timeout 60", ' '"path /usr/libexec/argo-monitoring/probes/argo", ' '"interval 5", "retryInterval 3"]' ) - self.assertEqual(metric.attribute, '["argo.ams_TOKEN --token"]') - self.assertEqual(metric.dependancy, '') - self.assertEqual(metric.flags, '["OBSESS 1"]') - self.assertEqual(metric.files, '') - self.assertEqual(metric.parameter, '["--project EGI"]') - self.assertEqual(metric.fileparameter, '') mt_history = poem_models.TenantHistory.objects.filter( object_repr='argo.AMS-Check' ).order_by('-date_created') @@ -3283,21 +3099,19 @@ def test_put_probe_missing_data_key_sp_user(self): serialized_data = json.loads(mt_history[0].serialized_data)[0]['fields'] self.assertEqual(serialized_data['name'], metric.name) self.assertEqual(serialized_data['mtype'], ['Active']) - self.assertEqual( - serialized_data['probekey'], ['ams-probe', '0.1.11'] - ) + self.assertEqual(serialized_data['probekey'], ['ams-probe', '0.1.11']) self.assertEqual(serialized_data['group'], ['TEST']) - self.assertEqual(serialized_data['parent'], metric.parent) + self.assertEqual(serialized_data['parent'], "") + self.assertEqual(serialized_data['probeexecutable'], '["ams-probe"]') + self.assertEqual(serialized_data['config'], metric.config) self.assertEqual( - serialized_data['probeexecutable'], metric.probeexecutable + serialized_data['attribute'], '["argo.ams_TOKEN --token"]' ) - self.assertEqual(serialized_data['config'], metric.config) - self.assertEqual(serialized_data['attribute'], metric.attribute) - self.assertEqual(serialized_data['dependancy'], metric.dependancy) - self.assertEqual(serialized_data['flags'], metric.flags) - self.assertEqual(serialized_data['files'], metric.files) - self.assertEqual(serialized_data['parameter'], metric.parameter) - self.assertEqual(serialized_data['fileparameter'], metric.fileparameter) + self.assertEqual(serialized_data['dependancy'], "") + self.assertEqual(serialized_data['flags'], '["OBSESS 1"]') + self.assertEqual(serialized_data['files'], "") + self.assertEqual(serialized_data['parameter'], '["--project EGI"]') + self.assertEqual(serialized_data['fileparameter'], "") def test_put_probe_missing_data_key_tenant_superuser(self): data = { @@ -3355,21 +3169,13 @@ def test_put_probe_missing_data_key_tenant_superuser(self): self.assertEqual(mt.probekey, version) metric = 
poem_models.Metric.objects.get(name='argo.AMS-Check') self.assertEqual(metric.group.name, 'TEST') - self.assertEqual(metric.parent, '') - self.assertEqual(metric.probeexecutable, '["ams-probe"]') - self.assertEqual(metric.probekey, version) + self.assertEqual(metric.probeversion, version.__str__()) self.assertEqual( metric.config, '["maxCheckAttempts 3", "timeout 60", ' '"path /usr/libexec/argo-monitoring/probes/argo", ' '"interval 5", "retryInterval 3"]' ) - self.assertEqual(metric.attribute, '["argo.ams_TOKEN --token"]') - self.assertEqual(metric.dependancy, '') - self.assertEqual(metric.flags, '["OBSESS 1"]') - self.assertEqual(metric.files, '') - self.assertEqual(metric.parameter, '["--project EGI"]') - self.assertEqual(metric.fileparameter, '') mt_history = poem_models.TenantHistory.objects.filter( object_repr='argo.AMS-Check' ).order_by('-date_created') @@ -3380,21 +3186,19 @@ def test_put_probe_missing_data_key_tenant_superuser(self): serialized_data = json.loads(mt_history[0].serialized_data)[0]['fields'] self.assertEqual(serialized_data['name'], metric.name) self.assertEqual(serialized_data['mtype'], ['Active']) - self.assertEqual( - serialized_data['probekey'], ['ams-probe', '0.1.11'] - ) + self.assertEqual(serialized_data['probekey'], ['ams-probe', '0.1.11']) self.assertEqual(serialized_data['group'], ['TEST']) - self.assertEqual(serialized_data['parent'], metric.parent) + self.assertEqual(serialized_data['parent'], "") + self.assertEqual(serialized_data['probeexecutable'], '["ams-probe"]') + self.assertEqual(serialized_data['config'], metric.config) self.assertEqual( - serialized_data['probeexecutable'], metric.probeexecutable + serialized_data['attribute'], '["argo.ams_TOKEN --token"]' ) - self.assertEqual(serialized_data['config'], metric.config) - self.assertEqual(serialized_data['attribute'], metric.attribute) - self.assertEqual(serialized_data['dependancy'], metric.dependancy) - self.assertEqual(serialized_data['flags'], metric.flags) - self.assertEqual(serialized_data['files'], metric.files) - self.assertEqual(serialized_data['parameter'], metric.parameter) - self.assertEqual(serialized_data['fileparameter'], metric.fileparameter) + self.assertEqual(serialized_data['dependancy'], "") + self.assertEqual(serialized_data['flags'], '["OBSESS 1"]') + self.assertEqual(serialized_data['files'], "") + self.assertEqual(serialized_data['parameter'], '["--project EGI"]') + self.assertEqual(serialized_data['fileparameter'], "") def test_put_probe_missing_data_key_tenant_user(self): data = { @@ -3452,46 +3256,34 @@ def test_put_probe_missing_data_key_tenant_user(self): self.assertEqual(mt.probekey, version) metric = poem_models.Metric.objects.get(name='argo.AMS-Check') self.assertEqual(metric.group.name, 'TEST') - self.assertEqual(metric.parent, '') - self.assertEqual(metric.probeexecutable, '["ams-probe"]') - self.assertEqual(metric.probekey, version) + self.assertEqual(metric.probeversion, version.__str__()) self.assertEqual( metric.config, '["maxCheckAttempts 3", "timeout 60", ' '"path /usr/libexec/argo-monitoring/probes/argo", ' '"interval 5", "retryInterval 3"]' ) - self.assertEqual(metric.attribute, '["argo.ams_TOKEN --token"]') - self.assertEqual(metric.dependancy, '') - self.assertEqual(metric.flags, '["OBSESS 1"]') - self.assertEqual(metric.files, '') - self.assertEqual(metric.parameter, '["--project EGI"]') - self.assertEqual(metric.fileparameter, '') mt_history = poem_models.TenantHistory.objects.filter( object_repr='argo.AMS-Check' ).order_by('-date_created') 
self.assertEqual(mt_history.count(), 1) - self.assertEqual( - mt_history[0].comment, 'Initial version.' - ) + self.assertEqual(mt_history[0].comment, 'Initial version.') serialized_data = json.loads(mt_history[0].serialized_data)[0]['fields'] self.assertEqual(serialized_data['name'], metric.name) self.assertEqual(serialized_data['mtype'], ['Active']) - self.assertEqual( - serialized_data['probekey'], ['ams-probe', '0.1.11'] - ) + self.assertEqual(serialized_data['probekey'], ['ams-probe', '0.1.11']) self.assertEqual(serialized_data['group'], ['TEST']) - self.assertEqual(serialized_data['parent'], metric.parent) + self.assertEqual(serialized_data['parent'], "") + self.assertEqual(serialized_data['probeexecutable'], '["ams-probe"]') + self.assertEqual(serialized_data['config'], metric.config) self.assertEqual( - serialized_data['probeexecutable'], metric.probeexecutable + serialized_data['attribute'], '["argo.ams_TOKEN --token"]' ) - self.assertEqual(serialized_data['config'], metric.config) - self.assertEqual(serialized_data['attribute'], metric.attribute) - self.assertEqual(serialized_data['dependancy'], metric.dependancy) - self.assertEqual(serialized_data['flags'], metric.flags) - self.assertEqual(serialized_data['files'], metric.files) - self.assertEqual(serialized_data['parameter'], metric.parameter) - self.assertEqual(serialized_data['fileparameter'], metric.fileparameter) + self.assertEqual(serialized_data['dependancy'], "") + self.assertEqual(serialized_data['flags'], '["OBSESS 1"]') + self.assertEqual(serialized_data['files'], "") + self.assertEqual(serialized_data['parameter'], '["--project EGI"]') + self.assertEqual(serialized_data['fileparameter'], "") def test_post_probe_sp_superuser(self): data = { diff --git a/poem/Poem/api/tests/test_tenanthistory.py b/poem/Poem/api/tests/test_tenanthistory.py index f5752411d..615bd6cea 100644 --- a/poem/Poem/api/tests/test_tenanthistory.py +++ b/poem/Poem/api/tests/test_tenanthistory.py @@ -5,11 +5,14 @@ from Poem.helpers.history_helpers import create_comment, serialize_metric from Poem.poem import models as poem_models from Poem.poem_super_admin import models as admin_models +from Poem.tenants.models import Tenant from Poem.users.models import CustUser from django.contrib.contenttypes.models import ContentType from django.core import serializers from django_tenants.test.cases import TenantTestCase from django_tenants.test.client import TenantRequestFactory +from django_tenants.utils import schema_context, get_public_schema_name, \ + get_tenant_domain_model from rest_framework import status from rest_framework.test import force_authenticate @@ -21,6 +24,14 @@ def setUp(self): self.url = '/api/v2/internal/tenantversion/' self.user = CustUser.objects.create_user(username='testuser') + with schema_context(get_public_schema_name()): + public_tenant = Tenant.objects.create( + name='public', schema_name=get_public_schema_name() + ) + get_tenant_domain_model().objects.create( + domain='public', tenant=public_tenant, is_primary=True + ) + tag = admin_models.OSTag.objects.create(name='CentOS 6') repo = admin_models.YumRepo.objects.create(name='repo-1', tag=tag) package1 = admin_models.Package.objects.create( @@ -81,8 +92,12 @@ def setUp(self): version_user=self.user.username ) - self.mtype1 = poem_models.MetricType.objects.create(name='Active') - self.mtype2 = poem_models.MetricType.objects.create(name='Passive') + self.mtype1 = admin_models.MetricTemplateType.objects.create( + name="Active" + ) + self.mtype2 = 
admin_models.MetricTemplateType.objects.create( + name="Passive" + ) self.mtag1 = admin_models.MetricTags.objects.create(name='test_tag1') self.mtag2 = admin_models.MetricTags.objects.create(name='test_tag2') @@ -94,10 +109,9 @@ def setUp(self): poem_models.GroupOfMetricProfiles.objects.create(name='EGI') poem_models.GroupOfMetricProfiles.objects.create(name='NEW_GROUP') - self.metric1 = poem_models.Metric.objects.create( + mt1 = admin_models.MetricTemplate.objects.create( name='argo.AMS-Check', mtype=self.mtype1, - group=group1, probekey=self.probever1, probeexecutable='["ams-probe"]', config='["maxCheckAttempts 3", "timeout 60",' @@ -107,11 +121,65 @@ def setUp(self): flags='["OBSESS 1"]', parameter='["--project EGI"]' ) + mt1.tags.add(self.mtag1, self.mtag2) + + mt1_version1 = admin_models.MetricTemplateHistory.objects.create( + object_id=mt1, + name=mt1.name, + mtype=mt1.mtype, + probekey=mt1.probekey, + parent=mt1.parent, + description=mt1.description, + probeexecutable=mt1.probeexecutable, + config=mt1.config, + attribute=mt1.attribute, + dependency=mt1.dependency, + flags=mt1.flags, + files=mt1.files, + parameter=mt1.parameter, + fileparameter=mt1.fileparameter, + date_created=datetime.datetime.now(), + version_user="poem", + version_comment="Initial version." + ) + mt1_version1.tags.add(self.mtag1, self.mtag2) + + mt1.name = 'argo.AMS-Check-new' + mt1.probeexecutable = '["ams-probe-2"]' + mt1.probekey = self.probever2 + mt1.description = 'Description of argo.AMS-Check-new.' + mt1.parent = '["new-parent"]' + mt1.attribute = '["argo.ams_TOKEN2 --token"]' + mt1.dependency = '["new-key new-value"]' + mt1.flags = '' + mt1.parameter = '' + mt1.fileparameter = '["new-key new-value"]' + mt1.save() + + mt1_version2 = admin_models.MetricTemplateHistory.objects.create( + object_id=mt1, + name=mt1.name, + mtype=mt1.mtype, + probekey=mt1.probekey, + parent=mt1.parent, + description=mt1.description, + probeexecutable=mt1.probeexecutable, + config=mt1.config, + attribute=mt1.attribute, + dependency=mt1.dependency, + flags=mt1.flags, + files=mt1.files, + parameter=mt1.parameter, + fileparameter=mt1.fileparameter, + date_created=datetime.datetime.now(), + version_user="poem", + version_comment=create_comment(mt1) + ) + mt1_version2.tags.add(self.mtag1, self.mtag2) - poem_models.Metric.objects.create( + mt2 = admin_models.MetricTemplate.objects.create( name='test.AMS-Check', mtype=self.mtype1, - group=group1, probekey=self.probever1, description='Description of test.AMS-Check.', probeexecutable='["ams-probe"]', @@ -123,6 +191,62 @@ def setUp(self): parameter='["--project EGI"]' ) + admin_models.MetricTemplateHistory.objects.create( + object_id=mt2, + name=mt2.name, + mtype=mt2.mtype, + probekey=mt2.probekey, + parent=mt2.parent, + description=mt2.description, + probeexecutable=mt2.probeexecutable, + config=mt2.config, + attribute=mt2.attribute, + dependency=mt2.dependency, + flags=mt2.flags, + files=mt2.files, + parameter=mt2.parameter, + fileparameter=mt2.fileparameter, + date_created=datetime.datetime.now(), + version_user="poem", + version_comment="Initial version." 
+ ) + + mt3 = admin_models.MetricTemplate.objects.create( + name='org.apel.APEL-Pub', + description='Description of org.apel.APEL-Pub.', + mtype=self.mtype2, + flags='["OBSESS 1", "PASSIVE 1"]' + ) + + admin_models.MetricTemplateHistory.objects.create( + object_id=mt3, + name=mt3.name, + mtype=mt3.mtype, + probekey=mt3.probekey, + parent=mt3.parent, + description=mt3.description, + probeexecutable=mt3.probeexecutable, + config=mt3.config, + attribute=mt3.attribute, + dependency=mt3.dependency, + flags=mt3.flags, + files=mt3.files, + parameter=mt3.parameter, + fileparameter=mt3.fileparameter, + date_created=datetime.datetime.now(), + version_user="poem", + version_comment="Initial version." + ) + + self.metric1 = poem_models.Metric.objects.create( + name='argo.AMS-Check', + group=group1, + probeversion=self.probever1.__str__(), + config='["maxCheckAttempts 3", "timeout 60",' + ' "path /usr/libexec/argo-monitoring/probes/argo",' + ' "interval 5", "retryInterval 3"]' + ) + self.ver1 = poem_models.TenantHistory.objects.create( object_id=self.metric1.id, serialized_data=serialize_metric( @@ -135,20 +259,21 @@ def setUp(self): user=self.user.username ) + poem_models.Metric.objects.create( + name='test.AMS-Check', + group=group1, + probeversion=self.probever1.__str__(), + config='["maxCheckAttempts 3", "timeout 60",' + ' "path /usr/libexec/argo-monitoring/probes/argo",' + ' "interval 5", "retryInterval 3"]' + ) + self.metric1.name = 'argo.AMS-Check-new' - self.metric1.probeexecutable = '["ams-probe-2"]' - self.metric1.probekey = self.probever2 - self.metric1.description = 'Description of argo.AMS-Check-new.' self.metric1.group = group2 - self.metric1.parent = '["new-parent"]' self.metric1.config = '["maxCheckAttempts 4", "timeout 70", ' \ '"path /usr/libexec/argo-monitoring/' \ 'probes/argo2", "interval 5", "retryInterval 4"]' - self.metric1.attribute = '["argo.ams_TOKEN2 --token"]' - self.metric1.dependancy = '["new-key new-value"]' - self.metric1.flags = '' - self.metric1.parameter = '' - self.metric1.fileparameter = '["new-key new-value"]' + self.metric1.probeversion = self.probever2.__str__() self.metric1.save() comment = create_comment( @@ -171,10 +296,7 @@ def setUp(self): self.metric2 = poem_models.Metric.objects.create( name='org.apel.APEL-Pub', - description='Description of org.apel.APEL-Pub.', - mtype=self.mtype2, - group=group1, - flags='["OBSESS 1", "PASSIVE 1"]' + group=group1 ) self.ver3 = poem_models.TenantHistory.objects.create( @@ -657,12 +779,6 @@ def test_get_nonexisting_metricprofile(self): self.assertEqual(response.status_code, status.HTTP_404_NOT_FOUND) self.assertEqual(response.data, {'detail': 'Metric profile not found.'}) - def test_get_metric_version_without_specifying_name(self): - request = self.factory.get(self.url + 'metricprofile') - force_authenticate(request, user=self.user) - response = self.view(request, 'metricprofile') - self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST) - def test_get_aggregation_profile_version(self): request = self.factory.get( self.url + 'aggregationprofile/TEST_PROFILE2' diff --git a/poem/Poem/api/tests/test_tenants.py b/poem/Poem/api/tests/test_tenants.py index 410c9b97e..4e0ca6dc2 100644 --- a/poem/Poem/api/tests/test_tenants.py +++ b/poem/Poem/api/tests/test_tenants.py @@ -33,9 +33,6 @@ def setUp(self) -> None: self.view = views.ListTenants.as_view() self.url = '/api/v2/internal/tenants/' self.user = CustUser.objects.create_user(username='testuser') - # get_tenant_domain_model().objects.create( - # 
domain='test.domain.url', tenant=self.tenant, is_primary=True - # ) with schema_context(get_public_schema_name()): self.tenant1 = Tenant(name='TEST1', schema_name='test1') diff --git a/poem/Poem/api/tests/test_utils.py b/poem/Poem/api/tests/test_utils.py index dfea75e23..951b01635 100644 --- a/poem/Poem/api/tests/test_utils.py +++ b/poem/Poem/api/tests/test_utils.py @@ -2,15 +2,18 @@ import json from unittest.mock import patch +import factory.django from Poem.api.internal_views.utils import sync_webapi, \ get_tenant_resources, sync_tags_webapi, WebApiException from Poem.api.models import MyAPIKey from Poem.helpers.history_helpers import create_comment +from Poem.helpers.history_helpers import serialize_metric from Poem.poem import models as poem_models from Poem.poem_super_admin import models as admin_models from Poem.users.models import CustUser from django.contrib.contenttypes.models import ContentType from django.core import serializers +from django.db.models.signals import pre_save from django_tenants.test.cases import TenantTestCase from django_tenants.utils import get_public_schema_name @@ -385,16 +388,14 @@ def test_sync_tags_with_error_without_msg(self, mock_put): ) +@factory.django.mute_signals(pre_save) class BasicResourceInfoTests(TenantTestCase): - def setUp(self) -> None: + def setUp(self): user = CustUser.objects.create_user(username='testuser') mtype1 = admin_models.MetricTemplateType.objects.create(name='Active') mtype2 = admin_models.MetricTemplateType.objects.create(name='Passive') - mtype3 = poem_models.MetricType.objects.create(name='Active') - mtype4 = poem_models.MetricType.objects.create(name='Passive') - ct = ContentType.objects.get_for_model(poem_models.Metric) tag1 = admin_models.OSTag.objects.create(name='CentOS 6') @@ -527,6 +528,7 @@ def setUp(self) -> None: name=metrictemplate1.name, mtype=metrictemplate1.mtype, probekey=metrictemplate1.probekey, + parent=metrictemplate1.parent, description=metrictemplate1.description, probeexecutable=metrictemplate1.probeexecutable, config=metrictemplate1.config, @@ -547,6 +549,7 @@ def setUp(self) -> None: mtype=metrictemplate2.mtype, description=metrictemplate2.description, probekey=metrictemplate2.probekey, + parent=metrictemplate2.parent, probeexecutable=metrictemplate2.probeexecutable, config=metrictemplate2.config, attribute=metrictemplate2.attribute, @@ -603,6 +606,7 @@ def setUp(self) -> None: mtype=metrictemplate3.mtype, description=metrictemplate3.description, probekey=metrictemplate3.probekey, + parent=metrictemplate3.parent, probeexecutable=metrictemplate3.probeexecutable, config=metrictemplate3.config, attribute=metrictemplate3.attribute, @@ -621,27 +625,14 @@ def setUp(self) -> None: metric1 = poem_models.Metric.objects.create( name=metrictemplate1.name, group=group, - mtype=mtype3, - description=metrictemplate1.description, - probekey=metrictemplate1.probekey, - probeexecutable=metrictemplate1.probeexecutable, - config=metrictemplate1.config, - attribute=metrictemplate1.attribute, - dependancy=metrictemplate1.dependency, - flags=metrictemplate1.flags, - files=metrictemplate1.files, - parameter=metrictemplate1.parameter, - fileparameter=metrictemplate1.fileparameter, + probeversion=metrictemplate1.probekey.__str__(), + config=metrictemplate1.config ) poem_models.TenantHistory.objects.create( object_id=metric1.id, object_repr=metric1.__str__(), - serialized_data=serializers.serialize( - 'json', [metric1], - use_natural_foreign_keys=True, - use_natural_primary_keys=True - ), + 
serialized_data=serialize_metric(metric1), content_type=ct, date_created=datetime.datetime.now(), comment='Initial version.', @@ -651,27 +642,14 @@ def setUp(self) -> None: metric2 = poem_models.Metric.objects.create( name=metrictemplate2.name, group=group, - mtype=mtype4, - description=metrictemplate2.description, - probekey=metrictemplate2.probekey, - probeexecutable=metrictemplate2.probeexecutable, - config=metrictemplate2.config, - attribute=metrictemplate2.attribute, - dependancy=metrictemplate2.dependency, - flags=metrictemplate2.flags, - files=metrictemplate2.files, - parameter=metrictemplate2.parameter, - fileparameter=metrictemplate2.fileparameter, + probeversion=metrictemplate2.probekey.__str__(), + config=metrictemplate2.config ) poem_models.TenantHistory.objects.create( object_id=metric2.id, object_repr=metric2.__str__(), - serialized_data=serializers.serialize( - 'json', [metric2], - use_natural_foreign_keys=True, - use_natural_primary_keys=True - ), + serialized_data=serialize_metric(metric2), content_type=ct, date_created=datetime.datetime.now(), comment='Initial version.', diff --git a/poem/Poem/api/tests/test_views.py b/poem/Poem/api/tests/test_views.py index aeb4eace8..2c7d0acd3 100644 --- a/poem/Poem/api/tests/test_views.py +++ b/poem/Poem/api/tests/test_views.py @@ -1,13 +1,13 @@ import datetime -from unittest.mock import patch, call import json +from unittest.mock import patch, call import factory from Poem.api import views from Poem.api.models import MyAPIKey from Poem.poem import models as poem_models from Poem.poem_super_admin import models as admin_models -from django.db.models.signals import post_save +from django.db.models.signals import post_save, pre_save from django_tenants.test.cases import TenantTestCase from django_tenants.test.client import TenantRequestFactory from rest_framework import status @@ -39,17 +39,10 @@ def mock_function(profile): return {'eu.egi.cloud.OCCI-Categories'} -@factory.django.mute_signals(post_save) +@factory.django.mute_signals(pre_save, post_save) def mock_db_for_metrics_tests(): - active_template = admin_models.MetricTemplateType.objects.create( - name='Active' - ) - passive_template = admin_models.MetricTemplateType.objects.create( - name='Passive' - ) - - active = poem_models.MetricType.objects.create(name='Active') - passive = poem_models.MetricType.objects.create(name='Passive') + active = admin_models.MetricTemplateType.objects.create(name='Active') + passive = admin_models.MetricTemplateType.objects.create(name='Passive') tag = admin_models.OSTag.objects.create(name='CentOS 6') repo = admin_models.YumRepo.objects.create(name='repo-1', tag=tag) @@ -148,7 +141,7 @@ def mock_db_for_metrics_tests(): mt1 = admin_models.MetricTemplate.objects.create( name='test.AMS-Check', - mtype=active_template, + mtype=active, probekey=probekey1, parent='["org.nagios.CDMI-TCP"]', probeexecutable='["ams-probe"]', @@ -164,27 +157,39 @@ def mock_db_for_metrics_tests(): ) mt1.tags.add(mtag1, mtag2) + mt1_version1 = admin_models.MetricTemplateHistory.objects.create( + object_id=mt1, + name=mt1.name, + mtype=mt1.mtype, + probekey=mt1.probekey, + parent=mt1.parent, + description=mt1.description, + probeexecutable=mt1.probeexecutable, + config=mt1.config, + attribute=mt1.attribute, + dependency=mt1.dependency, + flags=mt1.flags, + files=mt1.files, + parameter=mt1.parameter, + fileparameter=mt1.fileparameter, + date_created=datetime.datetime.now(), + version_user="poem", + version_comment="Initial version." 
+ ) + mt1_version1.tags.add(mtag1, mtag2) + poem_models.Metric.objects.create( name='test.AMS-Check', - mtype=active, group=group, - probekey=probekey1, - parent='["org.nagios.CDMI-TCP"]', - probeexecutable='["ams-probe"]', + probeversion=probekey1.__str__(), config='["maxCheckAttempts 3", "timeout 60", ' '"path /usr/libexec/argo-monitoring/probes/argo", "interval 5", ' - '"retryInterval 3"]', - attribute='["argo.ams_TOKEN --token"]', - dependancy='["argo.AMS-Check 1"]', - flags='["OBSESS 1"]', - files='["UCC_CONFIG UCC_CONFIG"]', - parameter='["--project EGI"]', - fileparameter='["FILE_SIZE_KBS 1000"]' + '"retryInterval 3"]' ) mt2 = admin_models.MetricTemplate.objects.create( name='argo.AMSPublisher-Check', - mtype=active_template, + mtype=active, probekey=probekey2, probeexecutable='["ams-publisher-probe"]', config='["maxCheckAttempts 1", "timeout 120", ' @@ -196,23 +201,39 @@ def mock_db_for_metrics_tests(): ) mt2.tags.add(mtag1, mtag3) + mt2_version1 = admin_models.MetricTemplateHistory.objects.create( + object_id=mt2, + name=mt2.name, + mtype=mt2.mtype, + probekey=mt2.probekey, + parent=mt2.parent, + description=mt2.description, + probeexecutable=mt2.probeexecutable, + config=mt2.config, + attribute=mt2.attribute, + dependency=mt2.dependency, + flags=mt2.flags, + files=mt2.files, + parameter=mt2.parameter, + fileparameter=mt2.fileparameter, + date_created=datetime.datetime.now(), + version_user="poem", + version_comment="Initial version." + ) + mt2_version1.tags.add(mtag1, mtag3) + poem_models.Metric.objects.create( name='argo.AMSPublisher-Check', - mtype=active, group=group, - probekey=probekey2, - probeexecutable='["ams-publisher-probe"]', + probeversion=probekey2.__str__(), config='["maxCheckAttempts 1", "timeout 120", ' '"path /usr/libexec/argo-monitoring/probes/argo", ' - '"interval 180", "retryInterval 1"]', - parameter='["-s /var/run/argo-nagios-ams-publisher/sock", ' - '"-q w:metrics+g:published180"]', - flags='["NOHOSTNAME 1", "NOTIMEOUT 1", "NOPUBLISH 1"]' + '"interval 180", "retryInterval 1"]' ) mt3 = admin_models.MetricTemplate.objects.create( name='hr.srce.CertLifetime-Local', - mtype=active_template, + mtype=active, probekey=probekey3, probeexecutable='["CertLifetime-probe"]', config='["maxCheckAttempts 2", "timeout 60", ' @@ -223,40 +244,94 @@ def mock_db_for_metrics_tests(): ) mt3.tags.add(mtag3) + mt3_version1 = admin_models.MetricTemplateHistory.objects.create( + object_id=mt3, + name=mt3.name, + mtype=mt3.mtype, + probekey=mt3.probekey, + parent=mt3.parent, + description=mt3.description, + probeexecutable=mt3.probeexecutable, + config=mt3.config, + attribute=mt3.attribute, + dependency=mt3.dependency, + flags=mt3.flags, + files=mt3.files, + parameter=mt3.parameter, + fileparameter=mt3.fileparameter, + date_created=datetime.datetime.now(), + version_user="poem", + version_comment="Initial version." 
+ ) + mt3_version1.tags.add(mtag3) + poem_models.Metric.objects.create( name='hr.srce.CertLifetime-Local', - mtype=active, group=group, - probekey=probekey3, - probeexecutable='["CertLifetime-probe"]', + probeversion=probekey3.__str__(), config='["maxCheckAttempts 2", "timeout 60", ' '"path /usr/libexec/argo-monitoring/probes/cert", ' - '"interval 240", "retryInterval 30"]', - attribute='["NAGIOS_HOST_CERT -f"]', - flags='["NOHOSTNAME 1", "NOPUBLISH 1"]' + '"interval 240", "retryInterval 30"]' ) - admin_models.MetricTemplate.objects.create( + mt4 = admin_models.MetricTemplate.objects.create( name='org.apel.APEL-Pub', - mtype=passive_template, + mtype=passive, flags='["OBSESS 1", "PASSIVE 1"]' ) + admin_models.MetricTemplateHistory.objects.create( + object_id=mt4, + name=mt4.name, + mtype=mt4.mtype, + probekey=mt4.probekey, + parent=mt4.parent, + description=mt4.description, + probeexecutable=mt4.probeexecutable, + config=mt4.config, + attribute=mt4.attribute, + dependency=mt4.dependency, + flags=mt4.flags, + files=mt4.files, + parameter=mt4.parameter, + fileparameter=mt4.fileparameter, + date_created=datetime.datetime.now(), + version_user="poem", + version_comment="Initial version." + ) + poem_models.Metric.objects.create( name='org.apel.APEL-Pub', - mtype=passive, - group=group, - flags='["OBSESS 1", "PASSIVE 1"]' + group=group ) - admin_models.MetricTemplate.objects.create( + mt5 = admin_models.MetricTemplate.objects.create( name='test.EMPTY-metric', - mtype=active_template + mtype=active + ) + + admin_models.MetricTemplateHistory.objects.create( + object_id=mt5, + name=mt5.name, + mtype=mt5.mtype, + probekey=mt5.probekey, + parent=mt5.parent, + description=mt5.description, + probeexecutable=mt5.probeexecutable, + config=mt5.config, + attribute=mt5.attribute, + dependency=mt5.dependency, + flags=mt5.flags, + files=mt5.files, + parameter=mt5.parameter, + fileparameter=mt5.fileparameter, + date_created=datetime.datetime.now(), + version_user="poem", + version_comment="Initial version." 
) poem_models.Metric.objects.create( - name='test.EMPTY-metric', - mtype=active + name='test.EMPTY-metric' ) poem_models.MetricConfiguration.objects.create( @@ -284,12 +359,12 @@ def mock_db_for_metrics_tests(): hostattribute="", metricparameter=json.dumps([ "eosccore.mon.devel.argo.grnet.gr argo.AMSPublisher-Check " - "-q w:metrics+g:published180" + "-q w:metrics+g:published180 -c 10" ]) ) -@factory.django.mute_signals(post_save) +@factory.django.mute_signals(pre_save, post_save) def mock_db_for_repos_tests(): tag1 = admin_models.OSTag.objects.create(name='CentOS 6') tag2 = admin_models.OSTag.objects.create(name='CentOS 7') @@ -630,111 +705,53 @@ def mock_db_for_repos_tests(): group = poem_models.GroupOfMetrics.objects.create(name='TEST') - metric_type = poem_models.MetricType.objects.create(name='Active') - poem_models.Metric.objects.create( name=mt1.name, group=group, - mtype=metric_type, - probekey=mt1.probekey, - probeexecutable=mt1.probeexecutable, - config=mt1.config, - attribute=mt1.attribute, - dependancy=mt1.dependency, - flags=mt1.flags, - files=mt1.files, - parameter=mt1.parameter, - fileparameter=mt1.fileparameter + probeversion=mt1.probekey.__str__(), + config=mt1.config ) poem_models.Metric.objects.create( name=mt2.name, group=group, - mtype=metric_type, - probekey=mt2.probekey, - probeexecutable=mt2.probeexecutable, - config=mt2.config, - attribute=mt2.attribute, - dependancy=mt2.dependency, - flags=mt2.flags, - files=mt2.files, - parameter=mt2.parameter, - fileparameter=mt2.fileparameter + probeversion=mt2.probekey.__str__(), + config=mt2.config ) poem_models.Metric.objects.create( name=mt3.name, group=group, - mtype=metric_type, - probekey=mt3.probekey, - probeexecutable=mt3.probeexecutable, - config=mt3.config, - attribute=mt3.attribute, - dependancy=mt3.dependency, - flags=mt3.flags, - files=mt3.files, - parameter=mt3.parameter, - fileparameter=mt3.fileparameter + probeversion=mt3.probekey.__str__(), + config=mt3.config ) poem_models.Metric.objects.create( name=mt4.name, group=group, - mtype=metric_type, - probekey=mt4.probekey, - probeexecutable=mt4.probeexecutable, - config=mt4.config, - attribute=mt4.attribute, - dependancy=mt4.dependency, - flags=mt4.flags, - files=mt4.files, - parameter=mt4.parameter, - fileparameter=mt4.fileparameter + probeversion=mt4.probekey.__str__(), + config=mt4.config ) poem_models.Metric.objects.create( name=mt5.name, group=group, - mtype=metric_type, - probekey=mt5.probekey, - probeexecutable=mt5.probeexecutable, - config=mt5.config, - attribute=mt5.attribute, - dependancy=mt5.dependency, - flags=mt5.flags, - files=mt5.files, - parameter=mt5.parameter, - fileparameter=mt5.fileparameter + probeversion=None, + config=mt5.config ) poem_models.Metric.objects.create( name=mt6.name, group=group, - mtype=metric_type, - probekey=probehistory5, - probeexecutable=mt6.probeexecutable, - config=mt6.config, - attribute=mt6.attribute, - dependancy=mt6.dependency, - flags=mt6.flags, - files=mt6.files, - parameter=mt6.parameter, - fileparameter=mt6.fileparameter + probeversion=probehistory5.__str__(), + config=mt6.config ) poem_models.Metric.objects.create( name=mt7.name, group=group, - mtype=metric_type, - probekey=probehistory7, - probeexecutable=mt7.probeexecutable, - config=mt7.config, - attribute=mt7.attribute, - dependancy=mt7.dependency, - flags=mt7.flags, - files=mt7.files, - parameter=mt7.parameter, - fileparameter=mt7.fileparameter + probeversion=probehistory7.__str__(), + config=mt7.config ) @@ -1425,7 +1442,7 @@ def 
test_list_metric_overrides(self): "hostname": "eosccore.mon.devel.argo.grnet.gr", "metric": "argo.AMSPublisher-Check", "parameter": "-q", - "value": "w:metrics+g:published180" + "value": "w:metrics+g:published180 -c 10" } ] }, @@ -1465,3 +1482,40 @@ def test_list_metric_overrides(self): } } ) + + +class ListDefaultPortsTests(TenantTestCase): + def setUp(self): + self.token = create_credentials() + self.view = views.ListDefaultPorts.as_view() + self.factory = TenantRequestFactory(self.tenant) + self.url = '/api/v2/metricoverrides' + + admin_models.DefaultPort.objects.create( + name="SITE_BDII_PORT", value="2170" + ) + admin_models.DefaultPort.objects.create(name="BDII_PORT", value="2170") + admin_models.DefaultPort.objects.create(name="GRAM_PORT", value="2119") + admin_models.DefaultPort.objects.create( + name="MYPROXY_PORT", value="7512" + ) + + def test_list_default_ports_if_wrong_token(self): + request = self.factory.get( + self.url, **{'HTTP_X_API_KEY': 'wrong_token'} + ) + response = self.view(request) + self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN) + + def test_list_default_ports(self): + request = self.factory.get(self.url, **{'HTTP_X_API_KEY': self.token}) + response = self.view(request) + self.assertEqual( + response.data, + { + "BDII_PORT": "2170", + "GRAM_PORT": "2119", + "MYPROXY_PORT": "7512", + "SITE_BDII_PORT": "2170" + } + ) diff --git a/poem/Poem/api/urls.py b/poem/Poem/api/urls.py index 47e526616..ac76a4d10 100644 --- a/poem/Poem/api/urls.py +++ b/poem/Poem/api/urls.py @@ -14,5 +14,6 @@ path('metrictemplates/', views.ListMetricTemplates.as_view()), path('metrictemplates/', views.ListMetricTemplates.as_view()), path('metricoverrides/', views.ListMetricOverrides.as_view()), + path('default_ports/', views.ListDefaultPorts.as_view()), path('internal/', include('Poem.api.urls_internal', namespace='internal')) ] diff --git a/poem/Poem/api/urls_internal.py b/poem/Poem/api/urls_internal.py index 16a257b4a..240c966a2 100644 --- a/poem/Poem/api/urls_internal.py +++ b/poem/Poem/api/urls_internal.py @@ -46,8 +46,6 @@ path('metrictemplates-import/', views_internal.ListMetricTemplatesForImport.as_view(), name='metrictemplates-import'), path('mttypes/', views_internal.ListMetricTemplateTypes.as_view(), name='mttypes'), path('public_mttypes/', views_internal.ListPublicMetricTemplateTypes.as_view(), name='mttypes'), - path('mtypes/', views_internal.ListMetricTypes.as_view(), name='mtypes'), - path('public_mtypes/', views_internal.ListPublicMetricTypes.as_view(), name='mtypes'), path('ostags/', views_internal.ListOSTags.as_view(), name='ostags'), path('public_ostags/', views_internal.ListPublicOSTags.as_view(), name='ostags'), path('packages/', views_internal.ListPackages.as_view(), name='packages'), @@ -59,9 +57,6 @@ path('probes/', views_internal.ListProbes.as_view(), name='probes'), path('public_probes/', views_internal.ListPublicProbes.as_view(), name='probes'), path('saml2login', views_internal.Saml2Login.as_view(), name='saml2login'), - path('serviceflavoursall/', views_internal.ListAllServiceFlavours.as_view(), name='serviceflavoursall'), - path('servicetypesdesc/', views_internal.ListServiceTypesDescriptions.as_view(), name='servicetypesdesc'), - path('public_servicetypesdesc/', views_internal.ListPublicServiceTypesDescriptions.as_view(), name='servicetypesdesc'), path('sessionactive/', views_internal.IsSessionActive.as_view(), name='sessionactive'), path('tenantversion//', views_internal.ListTenantVersions.as_view(), name='tenantversions'), 
path('thresholdsprofiles/', views_internal.ListThresholdsProfiles.as_view(), name='thresholdsprofiles'), @@ -98,5 +93,6 @@ path('reportsgroup/', views_internal.ListReportsInGroup.as_view(), name='reports'), path('reportsgroup/', views_internal.ListReportsInGroup.as_view(), name='reports'), path("metricconfiguration/", views_internal.ListMetricConfiguration.as_view(), name="metricconfiguration"), - path("metricconfiguration/", views_internal.ListMetricConfiguration.as_view(), name="metricconfiguration") + path("metricconfiguration/", views_internal.ListMetricConfiguration.as_view(), name="metricconfiguration"), + path("default_ports/", views_internal.ListDefaultPorts.as_view(), name="default_ports") ] diff --git a/poem/Poem/api/views.py b/poem/Poem/api/views.py index f1b7deb19..91b95a384 100644 --- a/poem/Poem/api/views.py +++ b/poem/Poem/api/views.py @@ -1,5 +1,6 @@ -import requests import json + +import requests from Poem.api.internal_views.utils import one_value_inline, \ two_value_inline_dict from Poem.api.models import MyAPIKey @@ -34,33 +35,54 @@ def build_metricconfigs(templates=False): mdict.update({m.name: dict()}) config = two_value_inline_dict(m.config) - parent = one_value_inline(m.parent) - probeexecutable = one_value_inline(m.probeexecutable) - attribute = two_value_inline_dict(m.attribute) + if templates: + parent = one_value_inline(m.parent) + probeexecutable = one_value_inline(m.probeexecutable) + attribute = two_value_inline_dict(m.attribute) dependency = two_value_inline_dict(m.dependency) - else: - dependency = two_value_inline_dict(m.dependancy) - flags = two_value_inline_dict(m.flags) - files = two_value_inline_dict(m.files) - parameter = two_value_inline_dict(m.parameter) - fileparameter = two_value_inline_dict(m.fileparameter) + flags = two_value_inline_dict(m.flags) + files = two_value_inline_dict(m.files) + parameter = two_value_inline_dict(m.parameter) + fileparameter = two_value_inline_dict(m.fileparameter) - if templates: - mdict[m.name].update( - {'tags': sorted([tag.name for tag in m.tags.all()])} - ) + tags = sorted([tag.name for tag in m.tags.all()]) + + docurl = m.probekey.docurl if m.probekey else "" else: - tags = [] try: - mt = admin_models.MetricTemplate.objects.get(name=m.name) + if m.probeversion: + probeversion = m.probeversion.split("(") + probe_name = probeversion[0].strip() + probe_version = probeversion[1][:-1].strip() + probe = admin_models.ProbeHistory.objects.get( + name=probe_name, package__version=probe_version + ) + mt = admin_models.MetricTemplateHistory.objects.get( + name=m.name, probekey=probe + ) + + else: + mt = admin_models.MetricTemplate.objects.get(name=m.name) + + parent = one_value_inline(mt.parent) + probeexecutable = one_value_inline(mt.probeexecutable) + attribute = two_value_inline_dict(mt.attribute) + dependency = two_value_inline_dict(mt.dependency) + flags = two_value_inline_dict(mt.flags) + files = two_value_inline_dict(mt.files) + parameter = two_value_inline_dict(mt.parameter) + fileparameter = two_value_inline_dict(mt.fileparameter) + tags = sorted([tag.name for tag in mt.tags.all()]) - except admin_models.MetricTemplate.DoesNotExist: - pass + docurl = mt.probekey.docurl if mt.probekey else "" + + except admin_models.MetricTemplateHistory.DoesNotExist: + continue - mdict[m.name].update({"tags": tags}) + mdict[m.name].update({"tags": tags}) if probeexecutable: mdict[m.name].update({'probe': probeexecutable}) @@ -107,11 +129,8 @@ def build_metricconfigs(templates=False): else: mdict[m.name].update({'parent': ''}) - if 
m.probekey: - docurl_field = m.probekey.docurl - mdict[m.name].update( - {'docurl': docurl_field} - ) + if docurl: + mdict[m.name].update({'docurl': docurl}) else: mdict[m.name].update({'docurl': ''}) @@ -221,8 +240,14 @@ def get(self, request, tag=None): for metric in metrics: try: m = models.Metric.objects.get(name=metric) - if m.probekey: - packages.add(m.probekey.package) + if m.probeversion: + probeversion = m.probeversion.split("(") + probe_name = probeversion[0].strip() + probe_version = probeversion[1][:-1].strip() + probe = admin_models.ProbeHistory.objects.get( + name=probe_name, package__version=probe_version + ) + packages.add(probe.package) except models.Metric.DoesNotExist: pass @@ -356,7 +381,7 @@ def _get_metric_parameters(db_entry): value = "" else: - value = split[3] + value = " ".join(split[3:]) results.append({ "hostname": split[0], @@ -387,3 +412,17 @@ def get(self, request): }) return Response(overrides) + + +class ListDefaultPorts(APIView): + permission_classes = (MyHasAPIKey,) + + def get(self, request): + data = admin_models.DefaultPort.objects.all() + + results = dict() + + for item in data: + results.update({item.name: item.value}) + + return Response(results) diff --git a/poem/Poem/api/views_internal.py b/poem/Poem/api/views_internal.py index 1625173e1..feff87628 100644 --- a/poem/Poem/api/views_internal.py +++ b/poem/Poem/api/views_internal.py @@ -10,7 +10,6 @@ from Poem.api.internal_views.package import * from Poem.api.internal_views.probes import * from Poem.api.internal_views.reports import * -from Poem.api.internal_views.servicetypes import * from Poem.api.internal_views.tenanthistory import * from Poem.api.internal_views.tenants import * from Poem.api.internal_views.thresholdsprofiles import * diff --git a/poem/Poem/frontend/react/APIKey.js b/poem/Poem/frontend/react/APIKey.js index 628003442..dafb0c361 100644 --- a/poem/Poem/frontend/react/APIKey.js +++ b/poem/Poem/frontend/react/APIKey.js @@ -12,7 +12,6 @@ import { ParagraphTitle, BaseArgoTable } from './UIElements'; -import { Formik, Form, Field } from 'formik'; import { Alert, FormGroup, @@ -23,11 +22,13 @@ import { Button, InputGroup, InputGroupText, - Input + Input, + Form } from 'reactstrap'; import { faClipboard } from '@fortawesome/free-solid-svg-icons'; import { useQuery, useQueryClient, useMutation } from 'react-query'; import { fetchAPIKeys } from './QueryFunctions'; +import { Controller, useForm } from 'react-hook-form'; const fetchAPIKey = async(name) => { @@ -125,12 +126,7 @@ export const APIKeyList = (props) => { } -export const APIKeyChange = (props) => { - const name = props.match.params.name; - const location = props.location; - const addview = props.addview; - const history = props.history; - +const APIKeyForm = ({ name, data, addview, history, location }) => { const backend = new Backend(); const queryClient = useQueryClient(); @@ -143,18 +139,16 @@ export const APIKeyChange = (props) => { const [modalTitle, setModalTitle] = useState(undefined) const [modalMsg, setModalMsg] = useState(undefined) const [onYes, setOnYes] = useState('') - const [formikValues, setFormikValues] = useState({}) const refToken = useRef(null); - const { data: key, error: error, status: status } = useQuery( - ['apikey', name], () => fetchAPIKey(name), - { enabled: !addview } - ) + const { control, handleSubmit, setValue, getValues } = useForm({ + defaultValues: !addview ? 
data : { name: "", revoked: false, token: "" } + }) const doChange = (values) => { if (!addview) { changeMutation.mutate( - { id: key.id, revoked: values.revoked, name: values.name }, { + { id: values.id, revoked: values.revoked, name: values.name }, { onSuccess: () => { queryClient.invalidateQueries('apikey'); NotifyOk({ @@ -191,7 +185,7 @@ export const APIKeyChange = (props) => { } }; - const onSubmitHandle = (values) => { + const onSubmitHandle = () => { let msg = `Are you sure you want to ${addview ? 'add' : 'change'} API key?`; let title = `${addview ? 'Add' : 'Change'}`; @@ -199,7 +193,6 @@ export const APIKeyChange = (props) => { setModalMsg(msg) setModalTitle(title) setOnYes('change') - setFormikValues(values) } const doDelete = () => { @@ -225,7 +218,7 @@ export const APIKeyChange = (props) => { if (onYes === 'delete') doDelete(name); else if (onYes === 'change') - doChange(formikValues); + doChange(getValues()); } const copyToClipboard = (e) => { @@ -237,134 +230,159 @@ export const APIKeyChange = (props) => { }); } + return ( + setAreYouSureModal(!areYouSureModal)}> +
+ + + + + + + } + /> + + A free-form unique identifier of the client. 50 characters max. + + + + + + + + { + return ( + setValue("revoked", e.target.checked)} + checked={field.value} + /> + + ) + }} + /> + + + + + + If the API key is revoked, clients cannot use it any more. (This cannot be undone.) + + + + + + + + { + addview && + + If token field is left empty, value will be automatically generated on save. + + } + + + + Token + + + } + /> + + + A public, unique identifier for this API key. + + + { + !addview && + + + + } + + + { +
+ { + (!addview) && + + } + +
+ } +
+
+ ) +} + + +export const APIKeyChange = (props) => { + const name = props.match.params.name; + const location = props.location; + const addview = props.addview; + const history = props.history; + + + const { data: key, error: error, status: status } = useQuery( + ['apikey', name], () => fetchAPIKey(name), + { enabled: !addview } + ) + if (status === 'loading') return (); else if (status === 'error') return (); - else + else if (key || addview) return ( - setAreYouSureModal(!areYouSureModal)}> - onSubmitHandle(values)} - > - {(props) => ( -
- - - - - - - A free-form unique identifier of the client. 50 characters max. - - - - - - - - props.setFieldValue('revoked', e.target.checked)} - checked={props.values.revoked} - /> - - - - - - If the API key is revoked, clients cannot use it any more. (This cannot be undone.) - - - - - - - - { - addview && - - If token field is left empty, value will be automatically generated on save. - - } - - - - Token - - - - A public, unique identifier for this API key. - - - { - !addview && - - - - } - - - { -
- { - (!addview) && - - } - -
- } -
- )} -
-
+ ) + else + return null } diff --git a/poem/Poem/frontend/react/Administration.js b/poem/Poem/frontend/react/Administration.js index 2ff19c9ed..0beda94e9 100644 --- a/poem/Poem/frontend/react/Administration.js +++ b/poem/Poem/frontend/react/Administration.js @@ -119,6 +119,9 @@ export const SuperAdminAdministration = () =>
Metric templates
+
+ Default ports +
diff --git a/poem/Poem/frontend/react/App.js b/poem/Poem/frontend/react/App.js index 882495cae..7fb9dc411 100644 --- a/poem/Poem/frontend/react/App.js +++ b/poem/Poem/frontend/react/App.js @@ -49,7 +49,7 @@ import { ThresholdsProfilesList, ThresholdsProfilesChange, ThresholdsProfileVers import './App.css'; import { PackageList, PackageComponent } from './Package'; -import { ServiceTypesList } from './ServiceTypes'; +import { ServiceTypesListPublic, ServiceTypesList, ServiceTypesBulkAdd } from './ServiceTypes'; import { TenantList, TenantChange } from './Tenants'; import { OperationsProfilesList, OperationsProfileDetails } from './OperationsProfiles'; import { CookiePolicy } from './CookiePolicy'; @@ -63,7 +63,6 @@ import { fetchMetricTags, fetchMetricTemplates, fetchMetricTemplateTypes, - fetchMetricTypes, fetchOperationsProfiles, fetchOStags, fetchPackages, @@ -80,10 +79,12 @@ import { fetchThresholdsProfiles, fetchTopologyTags, fetchTopologyGroups, - fetchTopologyEndpoints + fetchTopologyEndpoints, + fetchServiceTypes } from './QueryFunctions'; import { MetricTagsComponent, MetricTagsList } from './MetricTags'; import { MetricOverrideChange, MetricOverrideList } from './MetricOverrides'; +import { DefaultPortsList } from './DefaultPorts'; const NavigationBarWithRouter = withRouter(NavigationBar); @@ -143,7 +144,7 @@ const RedirectAfterLogin = ({isSuperUser}) => { } -const TenantRouteSwitch = ({webApiAggregation, webApiMetric, webApiThresholds, webApiOperations, webApiReports, token, tenantName, isSuperUser, userGroups}) => ( +const TenantRouteSwitch = ({webApiAggregation, webApiMetric, webApiThresholds, webApiOperations, webApiReports, webApiServiceTypes, token, tenantName, isSuperUser, userGroups}) => ( }/> @@ -187,12 +188,14 @@ const TenantRouteSwitch = ({webApiAggregation, webApiMetric, webApiThresholds, w webapimetric={webApiMetric} webapitoken={token} tenantname={tenantName} + webapiservicetypes={webApiServiceTypes} addview={true}/>} /> } /> @@ -200,6 +203,7 @@ const TenantRouteSwitch = ({webApiAggregation, webApiMetric, webApiThresholds, w render={props => } /> @@ -537,7 +541,20 @@ const TenantRouteSwitch = ({webApiAggregation, webApiMetric, webApiThresholds, w /> } + /> + } /> @@ -574,6 +591,7 @@ const SuperAdminRouteSwitch = () => ( }/> }/> + { const [webApiMetric, setWebApiMetric] = useState(undefined); const [webApiThresholds, setWebApiThresholds] = useState(undefined); const [webApiOperations, setWebApiOperations] = useState(undefined); + const [webApiServiceTypes, setWebApiServiceTypes] = useState(undefined); const [webApiReports, setWebApiReports] = useState(undefined); const [publicView, setPublicView] = useState(undefined); const [tenantName, setTenantName] = useState(undefined); @@ -657,6 +676,7 @@ const App = () => { setWebApiThresholds(options && options.result.webapithresholds); setWebApiOperations(options && options.result.webapioperations); setWebApiReports(options && options.result.webapireports); + setWebApiServiceTypes(options && options.result.webapiservicetypes); setTenantName(options && options.result.tenant_name); } options && prefetchData(false, poemType, options, poemType ? 
response.userdetails.token : null) @@ -675,6 +695,7 @@ const App = () => { setWebApiThresholds(options && options.result.webapithresholds); setWebApiOperations(options && options.result.webapioperations); setWebApiReports(options && options.result.webapireports); + setWebApiServiceTypes(options && options.result.webapiservicetypes); setPrivacyLink(options && options.result.terms_privacy_links.privacy); setTermsLink(options && options.result.terms_privacy_links.terms); setTenantName(options && options.result.tenant_name); @@ -724,7 +745,8 @@ const App = () => { aggregationProfiles: options.result.webapiaggregation, thresholdsProfiles: options.result.webapithresholds, operationsProfiles: options.result.webapioperations, - reportsConfigurations: options.result.webapireports + reportsConfigurations: options.result.webapireports, + serviceTypes: options.result.webapiservicetypes }) if (!isPublic) @@ -735,9 +757,6 @@ const App = () => { queryClient.prefetchQuery( `${isPublic ? 'public_' : ''}metric`, () => fetchMetrics(isPublic) ); - queryClient.prefetchQuery( - `${isPublic ? 'public_' : ''}metricstypes`, () => fetchMetricTypes(isPublic) - ); queryClient.prefetchQuery( [`${isPublic ? 'public_' : ''}metric`, 'usergroups'], () => fetchUserGroups(isTenant, isPublic, 'metrics') ) @@ -765,6 +784,9 @@ const App = () => { queryClient.prefetchQuery( [`${isPublic ? 'public_' : ''}thresholdsprofile`, 'webapi'], () => fetchThresholdsProfiles(webapi) ) + queryClient.prefetchQuery( + [`${isPublic ? 'public_' : ''}servicetypes`, 'webapi'], () => fetchServiceTypes(webapi) + ) queryClient.prefetchQuery( `${isPublic ? 'public_' : ''}operationsprofile`, () => fetchOperationsProfiles(webapi) ); @@ -807,20 +829,21 @@ const App = () => { localStorage.setItem('referrer', JSON.stringify(stackUrls)); } + async function fetchData() { + if (isPublicUrl()) { + initalizePublicState() + } + else { + let isTenantSchema = await backend.isTenantSchema(); + let response = await backend.isActiveSession(isTenantSchema); + response.active && initalizeState(isTenantSchema, response); + } + + getAndSetReferrer() + } + useEffect(() => { fetchData(); - async function fetchData() { - if (isPublicUrl()) { - initalizePublicState() - } - else { - let isTenantSchema = await backend.isTenantSchema(); - let response = await backend.isActiveSession(isTenantSchema); - response.active && initalizeState(isTenantSchema, response); - } - - getAndSetReferrer() - } }, []) if (publicView && privacyLink && termsLink && isTenantSchema !== undefined) { @@ -1074,9 +1097,10 @@ const App = () => { - } @@ -1256,6 +1280,7 @@ const App = () => { webApiThresholds={webApiThresholds} webApiOperations={webApiOperations} webApiReports={webApiReports} + webApiServiceTypes={webApiServiceTypes} token={token} tenantName={tenantName} isSuperUser={userDetails.is_superuser} diff --git a/poem/Poem/frontend/react/DataManager.js b/poem/Poem/frontend/react/DataManager.js index f5ff6aec7..608e8734d 100644 --- a/poem/Poem/frontend/react/DataManager.js +++ b/poem/Poem/frontend/react/DataManager.js @@ -262,7 +262,8 @@ export class WebApi { aggregationProfiles=undefined, thresholdsProfiles=undefined, operationsProfiles=undefined, - reportsConfigurations=undefined + reportsConfigurations=undefined, + serviceTypes=undefined }) { this.token = token; this.metricprofiles = metricProfiles; @@ -270,6 +271,7 @@ export class WebApi { this.thresholdsprofiles = thresholdsProfiles; this.operationsprofiles = operationsProfiles; this.reports = reportsConfigurations; + this.servicetypes = 
serviceTypes; } async fetchProfiles(url) { @@ -334,6 +336,10 @@ export class WebApi { return this.fetchProfiles(this.reports['tags']); } + async fetchServiceTypes() { + return this.fetchProfiles(this.servicetypes); + } + fetchMetricProfile(id) { return this.fetchProfile(`${this.metricprofiles}/${id}`); } @@ -394,6 +400,22 @@ export class WebApi { return this.addProfile(this.reports['main'], report); } + async addServiceTypes(service_types) { + try { + await this.addProfile(this.servicetypes, service_types); + } + catch (err) { + if (err.message.includes('409')) + try { + await this.deleteProfile(this.servicetypes); + await this.addProfile(this.servicetypes, service_types); + } + catch (err2) { + throw Error(err2) + } + } + } + changeReport(report) { return this.changeProfile(`${this.reports['main']}`, report); } diff --git a/poem/Poem/frontend/react/DefaultPorts.js b/poem/Poem/frontend/react/DefaultPorts.js new file mode 100644 index 000000000..4e164074d --- /dev/null +++ b/poem/Poem/frontend/react/DefaultPorts.js @@ -0,0 +1,346 @@ +import { FontAwesomeIcon } from '@fortawesome/react-fontawesome'; +import React, { useEffect, useState } from 'react'; +import { useMutation, useQuery, useQueryClient } from 'react-query'; +import { + Form, + Row, + Col, + Table, + Button, + Input, + InputGroup, + FormFeedback, + Alert +} from 'reactstrap'; +import { Backend } from './DataManager'; +import { + LoadingAnim, + ErrorComponent, + SearchField, + NotifyOk, + ModalAreYouSure, + NotifyError +} from './UIElements'; +import { + faSearch, + faTimes, + faPlus, + faInfoCircle +} from '@fortawesome/free-solid-svg-icons'; +import { + Controller, + useFieldArray, + useForm, + useWatch +} from 'react-hook-form'; +import { ErrorMessage } from '@hookform/error-message'; +import { yupResolver } from '@hookform/resolvers/yup'; +import * as Yup from "yup"; + + +const validationSchema = Yup.object().shape({ + defaultPorts: Yup.array().of( + Yup.object().shape({ + name: Yup.string() + .required("This field is required") + .matches(/^[a-zA-Z][A-Za-z0-9\-_]*$/, "Name can contain alphanumeric characters, dash and underscore, but must always begin with a letter") + .test("unique", "Duplicate", function (value) { + let arr = this.options.context.map(e => e.name) + if (arr.indexOf(value) === arr.lastIndexOf(value)) + return true + + else + return false + }), + value: Yup.string().required("This field is required").matches(/^[0-9]*$/, "Value must contain numeric characters") + }) + ) +}) + + +const PortsList = ({ data }) => { + const backend = new Backend() + + const queryClient = useQueryClient() + + const mutation = useMutation(async (values) => await backend.addObject("/api/v2/internal/default_ports/", values)) + + const [areYouSureModal, setAreYouSureModal] = useState(false) + const [modalTitle, setModalTitle] = useState('') + const [modalMsg, setModalMsg] = useState('') + const [modalFunc, setModalFunc] = useState(undefined) + const [isDeleted, setIsDeleted] = useState(false) + const [contextData, setContextData] = useState(new Object()) + + const { control, setValue, getValues, handleSubmit, formState: { errors } } = useForm({ + defaultValues: { + defaultPorts: data.length > 0 ? 
data.map(e => { return { name: e.name, value: e.value, new: false } }) : [{ name: "", value: "", new: true }], + searchPortName: "", + searchPortValue: "" + }, + mode: "all", + resolver: yupResolver(validationSchema), + context: contextData + }) + + const searchPortName = useWatch({ control, name: "searchPortName" }) + const searchPortValue = useWatch({ control, name: "searchPortValue" }) + const defaultPorts = useWatch({ control, name: "defaultPorts" }) + + const { fields, insert, remove } = useFieldArray({ control, name: "defaultPorts" }) + + useEffect(() => { + setValue("defaultPorts", data.length > 0 ? data.map(e => { return { name: e.name, value: e.value } }) : [{ name: "", value: "", new: true }]) + }, [data]) + + useEffect(() => { + setContextData(defaultPorts) + }, [defaultPorts]) + + const toggleModal = () => { + setAreYouSureModal(!areYouSureModal) + } + + const onSubmit = () => { + setModalMsg("Are you sure you want to change default ports?") + setModalTitle("Change default ports") + setModalFunc(() => doSave) + setAreYouSureModal(!areYouSureModal) + } + + const doSave = () => { + let sendData = { + ports: [...getValues("defaultPorts").map((e) => new Object({ name: e.name, value: e.value }))] + } + + mutation.mutate(sendData, { + onSuccess: () => { + queryClient.invalidateQueries("defaultports") + NotifyOk({ + msg: "Default ports successfully changed", + title: "Changed" + }) + }, + onError: (error) => { + NotifyError({ + msg: error?.message ? error.message : "Error changing default ports", + title: "Error" + }) + } + }) + + setIsDeleted(false) + } + + let fieldsView = fields + + if (searchPortName) + fieldsView = fields.filter(e => e.name.toLowerCase().includes(searchPortName.toLowerCase())) + + if (searchPortValue) + fieldsView = fields.filter(e => e.value.toLowerCase().includes(searchPortValue.toLowerCase())) + + return ( + <> + +
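The duplicate-name rule in the default-ports schema above works by feeding the current field-array values back into the resolver as Yup context (`context: contextData`, refreshed from `useWatch`). A minimal self-contained sketch of that pattern — the `ports` field name and the sample values are illustrative, not taken from the patch:

```
// Sketch (assumed field name `ports`): every ports[].name must be unique,
// checked against the current array handed to Yup as validation context.
import * as Yup from "yup";

const schema = Yup.object().shape({
  ports: Yup.array().of(
    Yup.object().shape({
      name: Yup.string()
        .required("This field is required")
        .test("unique", "Duplicate", function (value) {
          // `this.options.context` is whatever the caller passes as `context`
          const names = (this.options.context || []).map((p) => p.name);
          return names.indexOf(value) === names.lastIndexOf(value);
        }),
    })
  ),
});

// Usage: validate against the latest values by supplying them as context.
const values = { ports: [{ name: "http" }, { name: "http" }] };
schema
  .validate(values, { context: values.ports, abortEarly: false })
  .catch((err) => console.log(err.errors)); // ["Duplicate", "Duplicate"]
```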
+

Default ports

+ + + +
+
+
+ { + (isDeleted) && + +
+   + Some ports have been deleted. To store the change to the DB, click on the save button. +
+
+ } + + + + + + + + + + + + + + + + + + + { + fieldsView.map((entry, index) => + + + + + + + ) + } + +
#Port namePort valueAction
+ + + + + } + /> + + + + } + /> +
+ { index + 1 } + + { + entry.new ? + + + + } + /> + + + { message } + + } + /> + + : + { entry.name } + } + + { + entry.new ? + + + + } + /> + + + { message } + + } + /> + + : + { entry.value } + } + + + +
+ +
+
+
+ + ) +} + + +export const DefaultPortsList = () => { + const backend = new Backend(); + + const { data: defaultPorts, error, status } = useQuery( + "defaultports", async () => { + return await backend.fetchData("/api/v2/internal/default_ports") + } + ) + + if (status === 'loading') + return (); + + else if (status === 'error') + return (); + + else if (defaultPorts) { + return ( + + ) + } + else + return null +} diff --git a/poem/Poem/frontend/react/GroupElements.js b/poem/Poem/frontend/react/GroupElements.js index cadc9996d..8f35d19b4 100644 --- a/poem/Poem/frontend/react/GroupElements.js +++ b/poem/Poem/frontend/react/GroupElements.js @@ -12,20 +12,33 @@ import { BaseArgoTable } from './UIElements'; import { Link } from 'react-router-dom'; -import { Formik, Form, Field } from 'formik'; import { FormGroup, Row, Col, Button, InputGroup, - InputGroupText + InputGroupText, + Input, + Form, + FormFeedback } from 'reactstrap'; import { FontAwesomeIcon } from '@fortawesome/react-fontawesome'; import { faTimes, faSearch } from '@fortawesome/free-solid-svg-icons'; import Select from 'react-select'; import { fetchUserGroups } from './QueryFunctions'; import { useQuery, useMutation, useQueryClient } from 'react-query'; +import { Controller, useForm, useWatch } from 'react-hook-form'; +import { ErrorMessage } from '@hookform/error-message'; +import { yupResolver } from '@hookform/resolvers/yup'; +import * as Yup from "yup"; + + +const validationSchema = Yup.object().shape({ + name: Yup.string() + .required("This field is required") + .matches(/^[a-zA-Z][A-Za-z0-9\-_]*$/, "Name can contain alphanumeric characters, dash and underscore, but must always begin with a letter") +}) export const GroupList = (props) => { @@ -82,24 +95,7 @@ export const GroupList = (props) => { }; -export const GroupChange = (props) => { - const [searchItem, setSearchItem] = useState(''); - const [newItems, setNewItems] = useState([]); - const [areYouSureModal, setAreYouSureModal] = useState(undefined); - const [modalTitle, setModalTitle] = useState(undefined); - const [modalMsg, setModalMsg] = useState(undefined); - const [modalFlag, setModalFlag] = useState(undefined); - const [formValues, setFormValues] = useState(undefined); - - const groupname = props.match.params.name; - const group = props.group; - const id = props.id; - const title = props.title; - const addview = props.addview; - - const location = props.location; - const history = props.history; - +const GroupChangeForm = ({ id, items, freeItems, group, groupname, title, addview, location, history }) => { const queryClient = useQueryClient(); const backend = new Backend(); @@ -108,39 +104,46 @@ export const GroupChange = (props) => { const addMutation = useMutation(async (values) => await backend.addObject(`/api/v2/internal/${group}group/`, values)); const deleteMutation = useMutation(async () => await backend.deleteObject(`/api/v2/internal/${group}group/${groupname}`)); - const { data: items, error: errorItems, isLoading: loadingItems } = useQuery( - [`${group}group`, groupname], async () => { - return await backend.fetchResult(`/api/v2/internal/${group}group/${groupname}`); + const [newItems, setNewItems] = useState([]); + const [areYouSureModal, setAreYouSureModal] = useState(undefined); + const [modalTitle, setModalTitle] = useState(undefined); + const [modalMsg, setModalMsg] = useState(undefined); + const [modalFlag, setModalFlag] = useState(undefined); + + const { control, getValues, setValue, handleSubmit, formState: { errors } } = useForm({ + defaultValues: 
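`DefaultPortsList` above follows the usual react-query wrapper shape: fetch, branch on `status`, then hand the data to a presentational component. A condensed sketch of that shape — `fetchPorts`, `PortsTable` and `PortsPage` are placeholder names; only the endpoint path comes from the patch:

```
// Sketch of the fetch-then-branch wrapper used by DefaultPortsList above.
import React from "react";
import { useQuery } from "react-query";

const fetchPorts = async () => {
  const response = await fetch("/api/v2/internal/default_ports");
  if (!response.ok) throw new Error(`Error fetching ports: ${response.status}`);
  return response.json();
};

const PortsTable = ({ data }) => (
  <ul>
    {data.map((port) => (
      <li key={port.name}>{`${port.name}: ${port.value}`}</li>
    ))}
  </ul>
);

export const PortsPage = () => {
  const { data, error, status } = useQuery("defaultports", fetchPorts);

  if (status === "loading") return <p>Loading...</p>;
  if (status === "error") return <p>{error.message}</p>;
  return data ? <PortsTable data={data} /> : null;
};
```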
{ + name: !addview ? groupname : "", + items: items ? items : [], + freeItems: freeItems.map(itm => new Object({ value: itm, label: itm })), + searchItems: "" }, - { enabled: !addview } - ) + resolver: yupResolver(validationSchema) + }) - const { data: freeItems, error: errorFreeItems, isLoading: loadingFreeItems } = useQuery( - `${group}group`, async () => { - return await backend.fetchResult(`/api/v2/internal/${group}group`); - } - ) + const searchItems = useWatch({ control, name: "searchItems" }) + const watchedItems = useWatch({ control, name: "items" }) function toggleAreYouSure() { setAreYouSureModal(!areYouSureModal); } - function onSubmitHandle(values) { - setModalMsg(`Are you sure you want to ${addview ? 'add' : 'change'} group of ${title}?`); - setModalTitle(`${addview ? 'Add' : 'Change'} group of ${title}`); - setModalFlag('submit'); - setFormValues(values); - toggleAreYouSure(); + function onSubmitHandle() { + setModalMsg(`Are you sure you want to ${addview ? 'add' : 'change'} group of ${title}?`) + setModalTitle(`${addview ? 'Add' : 'Change'} group of ${title}`) + setModalFlag('submit') + toggleAreYouSure() } function onDeleteHandle() { - setModalMsg(`Are you sure you want to delete group of ${title}?`); - setModalTitle(`Delete group of ${title}`); - setModalFlag('delete'); - toggleAreYouSure(); + setModalMsg(`Are you sure you want to delete group of ${title}?`) + setModalTitle(`Delete group of ${title}`) + setModalFlag('delete') + toggleAreYouSure() } function doChange() { + let formValues = getValues() + const sendValues = new Object({ name: formValues.name, items: formValues.items @@ -207,6 +210,224 @@ export const GroupChange = (props) => { }) } + return ( + +
+ + + + + Name + + + } + /> + + + { message } + + } + /> + + + + + + + + + `No available ${title}`} - isMulti - onChange={e => setNewItems(e)} - openMenuOnClick={true} - value={newItems} - options={props.values.freeItems} - /> - - - - - - - - - - - - - - - - - - - - { - props.values.items.filter( - filteredRow => filteredRow.toLowerCase().includes(searchItem.toLowerCase()) - ).map((item, index) => - - - - - - - - ) - } - -
# | {`${title.charAt(0).toUpperCase() + title.slice(1)} in group`} | Remove
- - - setSearchItem(e.target.value)} - component={SearchField} - /> - {''}
- {index + 1} - {item} - -
-
- { -
- { - !addview ? - - : -
- } - -
- } -
- )} - -
- ); + + ) } else return null; -}; +} diff --git a/poem/Poem/frontend/react/MetricProfiles.js b/poem/Poem/frontend/react/MetricProfiles.js index 4002769a7..a2746ebf5 100644 --- a/poem/Poem/frontend/react/MetricProfiles.js +++ b/poem/Poem/frontend/react/MetricProfiles.js @@ -146,7 +146,7 @@ const MetricProfileTupleValidate = ({view_services, name, groupname, } else if (i.service && (i.isNew || i.serviceChanged) && - services_all.indexOf(i.service) == -1) { + services_all.map(service => service.name).indexOf(i.service) == -1) { obj.service = "Must be one of predefined service types" empty = true } @@ -224,7 +224,7 @@ const ServicesList = () => { service.name)} service={service} index={index} icon='serviceflavour' @@ -312,13 +312,6 @@ const fetchMetricProfile = async (webapi, apiid) => { } -const fetchServiceFlavours = async () => { - const backend = new Backend(); - - return await backend.fetchListOfNames('/api/v2/internal/serviceflavoursall'); -} - - export const MetricProfilesComponent = (props) => { const profile_name = props.match.params.name; const addview = props.addview @@ -330,7 +323,8 @@ export const MetricProfilesComponent = (props) => { const backend = new Backend(); const webapi = new WebApi({ token: props.webapitoken, - metricProfiles: props.webapimetric + metricProfiles: props.webapimetric, + serviceTypes: props.webapiservicetypes }) const queryClient = useQueryClient(); @@ -370,7 +364,8 @@ export const MetricProfilesComponent = (props) => { { enabled: (publicView || !addview), initialData: () => { - return queryClient.getQueryData([`${publicView ? 'public_' : ''}metricprofile`, "backend"])?.find(mpr => mpr.name === profile_name) + if (!addview) + return queryClient.getQueryData([`${publicView ? 'public_' : ''}metricprofile`, "backend"])?.find(mpr => mpr.name === profile_name) } } ) @@ -379,9 +374,10 @@ export const MetricProfilesComponent = (props) => { [`${publicView ? 'public_' : ''}metricprofile`, 'webapi', profile_name], () => fetchMetricProfile(webapi, backendMP.apiid), { - enabled: !!backendMP, + enabled: !!backendMP && !addview, initialData: () => { - return queryClient.getQueryData([`${publicView ? "public_" : ""}metricprofile`, "webapi"])?.find(profile => profile.id == backendMP.apiid) + if (!addview) + return queryClient.getQueryData([`${publicView ? 
"public_" : ""}metricprofile`, "webapi"])?.find(profile => profile.id == backendMP.apiid) } } ) @@ -391,9 +387,11 @@ export const MetricProfilesComponent = (props) => { { enabled: !publicView } ) - const { data: serviceFlavoursAll, error: errorServiceFlavoursAll, isLoading: loadingServiceFlavoursAll } = useQuery( - 'serviceflavoursall', () => fetchServiceFlavours(), - { enabled: !publicView } + const { data: webApiST, errorWebApiST, isLoading: loadingWebApiST} = useQuery( + ['servicetypes', 'webapi'], async () => { + return await webapi.fetchServiceTypes(); + }, + { enabled: !!userDetails } ) const onInsert = async (element, i, group, name, description) => { @@ -815,7 +813,7 @@ export const MetricProfilesComponent = (props) => { ); } - if (loadingUserDetails || loadingBackendMP || loadingWebApiMP || loadingMetricsAll || loadingServiceFlavoursAll) + if (loadingUserDetails || loadingBackendMP || loadingWebApiMP || loadingMetricsAll || loadingWebApiST) return () else if (errorUserDetails) @@ -830,10 +828,10 @@ export const MetricProfilesComponent = (props) => { else if (errorMetricsAll) return () - else if (errorServiceFlavoursAll) - return () + else if (errorWebApiST) + return () - else if (addview || (backendMP && webApiMP) && (publicView || (metricsAll && serviceFlavoursAll))) + else if ((addview && webApiST) || (backendMP && webApiMP && webApiST) || (publicView)) { let write_perm = undefined @@ -958,7 +956,6 @@ export const MetricProfilesComponent = (props) => { })) item.isNew = true }) - setViewServices(ensureAlignedIndexes(imported).sort(sortServices)); setListServices(ensureAlignedIndexes(imported).sort(sortServices)); } @@ -979,7 +976,7 @@ export const MetricProfilesComponent = (props) => { search_metric: searchMetric, search_serviceflavour: searchServiceFlavour, metrics_all: metricsAll, - services_all: serviceFlavoursAll + services_all: webApiST }} onSubmit = {(values) => onSubmitHandle(values)} enableReinitialize={true} @@ -1008,7 +1005,7 @@ export const MetricProfilesComponent = (props) => { name="view_services" render={props => ( { ); const { data: types, error: typesError, isLoading: typesLoading } = useQuery( - `${publicView ? 'public_' : ''}${type}types`, - () => type === 'metrics' ? fetchMetricTypes(publicView) : fetchMetricTemplateTypes(publicView) - ); + `${publicView ? 'public_' : ''}metrictemplatestypes`, () => fetchMetricTemplateTypes(publicView) + ) const { data: tags, error: tagsError, isLoading: tagsLoading } = useQuery( `${publicView ? 
'public_' : ''}metrictags`, () => fetchMetricTags(publicView) diff --git a/poem/Poem/frontend/react/Package.js b/poem/Poem/frontend/react/Package.js index 063f89262..b25f53c38 100644 --- a/poem/Poem/frontend/react/Package.js +++ b/poem/Poem/frontend/react/Package.js @@ -1,4 +1,4 @@ -import React, { useState } from 'react'; +import React, { useEffect, useRef, useState } from 'react'; import { Backend } from './DataManager'; import { Link } from 'react-router-dom'; import{ @@ -12,7 +12,6 @@ import{ DefaultColumnFilter, SelectColumnFilter, BaseArgoTable, - CustomError, DropdownWithFormText } from './UIElements'; import { @@ -23,39 +22,35 @@ import { Button, InputGroup, InputGroupText, - Alert, Input, - Label + Label, + Form, + FormFeedback, + Alert } from 'reactstrap'; -import { Formik, Form, Field } from 'formik'; import { useMutation, useQuery, useQueryClient } from 'react-query'; import { fetchPackages, fetchProbeVersions, fetchYumRepos } from './QueryFunctions'; - - -const packageValidate = (values) => { - const errors = {}; - - if (!values.name) - errors.name = 'Required'; - - else if (!/^\S*$/.test(values.name)) - errors.name = 'Name cannot contain white spaces'; - - if (!values.present_version) { - if (!values.version) - errors.version = 'Required'; - - else if (!/^\S*$/.test(values.version)) - errors.version = 'Version cannot contain white spaces' - } - - if (!values.repo_6 && !values.repo_7) { - errors.repo_6 = 'You must provide at least one repo!'; - errors.repo_7 = 'You must provide at least one repo!'; - } - - return errors; -} +import { Controller, useForm, useWatch } from 'react-hook-form'; +import { ErrorMessage } from '@hookform/error-message'; +import { yupResolver } from '@hookform/resolvers/yup'; +import * as Yup from "yup"; + + +const validationSchema = Yup.object().shape({ + name: Yup.string() + .required("This field is required") + .matches(/^\S+$/, "Name cannot contain white spaces"), + presentVersion: Yup.boolean(), + version: Yup.string() + .required("This field is required") + .matches(/^\S+$/, "Version cannot contain white spaces"), + repo_6: Yup.string().test("repo", "You must provide at least one repo", function () { + return this.parent.repo_6 === "" && this.parent.repo_7 === "" ? false : true + }), + repo_7: Yup.string().test("repo", "You must provide at least one repo", function () { + return this.parent.repo_6 === "" && this.parent.repo_7 === "" ? 
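The `addview` guards added to the metric profile queries above share one idea: when the user is creating a brand-new profile there is nothing to fetch and nothing to seed the cache with, so both `enabled` and `initialData` are gated on `!addview`. A reduced sketch with assumed query keys:

```
// Sketch (assumed key names): never fetch or seed the cache for a profile
// that is only being created and does not exist on the backend yet.
import { useQuery, useQueryClient } from "react-query";

const useBackendProfile = (name, addview, fetchProfile) => {
  const queryClient = useQueryClient();

  return useQuery(["metricprofile", "backend", name], () => fetchProfile(name), {
    enabled: !addview,
    initialData: () => {
      if (!addview)
        // reuse the entry from the already fetched list view, if it is cached
        return queryClient
          .getQueryData(["metricprofile", "backend"])
          ?.find((profile) => profile.name === name);
    },
  });
};
```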
false : true + }) +}) export const PackageList = (props) => { @@ -137,201 +132,142 @@ export const PackageList = (props) => { ); } else return null; -}; +} -export const PackageComponent = (props) => { - const nameversion = props.match.params.nameversion; - const addview = props.addview; - const cloneview = props.cloneview; - const disabled = props.disabled; - const location = props.location; - const history = props.history; +function splitRepos(repos) { + let repo6 = ''; + let repo7 = ''; + for (let i = 0; i < repos.length; i++) { + if (repos[i].split('(')[1].slice(0, -1) === 'CentOS 6') + repo6 = repos[i]; - const backend = new Backend(); - const queryClient = useQueryClient(); + if (repos[i].split('(')[1].slice(0, -1) === 'CentOS 7') + repo7 = repos[i]; + } - const [repos6, setRepos6] = useState(new Array()) - const [repos7, setRepos7] = useState(new Array()) - const [probes, setProbes] = useState(new Array()) - const [presentVersion, setPresentVersion] = useState(false) + return [repo6, repo7]; +} - const changePackage = useMutation( async (values) => await backend.changeObject('/api/v2/internal/packages/', values) ); - const addPackage = useMutation( async (values) => await backend.addObject('/api/v2/internal/packages/', values) ); - const deletePackage = useMutation( async () => await backend.deleteObject(`/api/v2/internal/packages/${nameversion}`)); - const updateMetricsMutation = useMutation( async (values) => await backend.changeObject('/api/v2/internal/updatemetricsversions/', values) ); - const { data: pkg, error: errorPkg, status: statusPkg } = useQuery( - ['package', nameversion], async () => { - let pkg = await backend.fetchData(`/api/v2/internal/packages/${nameversion}`); - let [repo6, repo7] = splitRepos(pkg.repos); - pkg.initial_version = pkg.version; - pkg.repo_6 = repo6; - pkg.repo_7 = repo7; - return pkg; +const PackageForm = ({ + nameversion, addview, cloneview, disabled, location, history, pkg={}, probes=[], repos6=[], repos7=[], packageVersions=[] +}) => { + const backend = new Backend() + const queryClient = useQueryClient() + + // this ref is used to skip the version validation on the first render (in useEffect) + const didMountRef = useRef(false) + + const { control, getValues, setValue, handleSubmit, trigger, formState: { errors } } = useForm({ + defaultValues: { + id: `${pkg?.id ? pkg.id : ''}`, + name: `${pkg?.name ? pkg.name : ''}`, + version: `${pkg?.version ? pkg.version : ''}`, + initialVersion: `${pkg?.version ? pkg.version : ""}`, + repo_6: `${pkg?.repos ? splitRepos(pkg.repos)[0] : ''}`, + repo_7: `${pkg?.repos ? 
splitRepos(pkg.repos)[1] : ''}`, + present_version: pkg?.version === "present" }, - { - enabled: !addview, - initialData: () => { - if (!addview) { - let pkgs = queryClient.getQueryData('package'); - if (pkgs) { - let pkg = pkgs.find(pkg => nameversion == `${pkg.name}-${pkg.version}`) - let [repo6, repo7] = splitRepos(pkg.repos); - pkg.initial_version = pkg.version; - pkg.repo_6 = repo6; - pkg.repo_7 = repo7; - return pkg; - } - } - }, - onSuccess: (data) => { - if (data.version === 'present') - setPresentVersion(true) - } - } - ); + resolver: yupResolver(validationSchema), + mode: "all" + }) - const { data: repos, error: errorRepos, status: statusRepos } = useQuery( - 'yumrepo', () => fetchYumRepos(), - { - onSuccess: (data) => { - let listRepos6 = [] - let listRepos7 = [] + const presentVersion = useWatch({ control, name: "present_version" }) - data.forEach(repo => { - if (repo.tag === 'CentOS 6') - listRepos6.push(`${repo.name} (${repo.tag})`) + useEffect(() => { + if (presentVersion) + setValue("version", "present") - else if (repo.tag === 'CentOS 7') - listRepos7.push(`${repo.name} (${repo.tag})`) - }) + if (didMountRef.current) + trigger("version") - setRepos6(listRepos6) - setRepos7(listRepos7) - } - } - ); - - const { error: errorProbes, status: statusProbes } = useQuery( - ['probe', 'version'], () => fetchProbeVersions(), - { - enabled: !!pkg, - onSuccess: (data) => { - let listProbes = new Array() - if (pkg) { - data.forEach(probe => { - if (probe.fields.package === `${pkg.name} (${pkg.version})`) - listProbes.push(probe.fields.name) - }) - } - setProbes(listProbes) - } - } - ); - - const { data: packageVersions, error: errorPackageVersions, status: statusPackageVersions } = useQuery( - ['package', 'versions', nameversion], async () => { - let pkg_versions = [] - if (!addview) - pkg_versions = await backend.fetchData(`/api/v2/internal/packageversions/${pkg.name}`); + didMountRef.current = true + }, [presentVersion, setValue]) - return pkg_versions - }, - { enabled: !!pkg } - ); + const changePackage = useMutation( async (values) => await backend.changeObject('/api/v2/internal/packages/', values) ); + const addPackage = useMutation( async (values) => await backend.addObject('/api/v2/internal/packages/', values) ); + const deletePackage = useMutation( async () => await backend.deleteObject(`/api/v2/internal/packages/${nameversion}`)); + const updateMetricsMutation = useMutation( async (values) => await backend.changeObject('/api/v2/internal/updatemetricsversions/', values) ); const [disabledButton, setDisabledButton] = useState(true); const [areYouSureModal, setAreYouSureModal] = useState(false); const [modalFlag, setModalFlag] = useState(undefined); const [modalTitle, setModalTitle] = useState(undefined); const [modalMsg, setModalMsg] = useState(undefined); - const [formValues, setFormValues] = useState(undefined); function toggleAreYouSure() { setAreYouSureModal(!areYouSureModal); } - function splitRepos(repos) { - let repo6 = ''; - let repo7 = ''; - for (let i = 0; i < repos.length; i++) { - if (repos[i].split('(')[1].slice(0, -1) === 'CentOS 6') - repo6 = repos[i]; - - if (repos[i].split('(')[1].slice(0, -1) === 'CentOS 7') - repo7 = repos[i]; - } - - return [repo6, repo7]; - } - - function onVersionSelect(props, value) { - let initial_version = pkg.initial_version; + function onVersionSelect(value) { + let initial_version = getValues("initialVersion") packageVersions.forEach(pkgv => { if (pkgv.version === value) { - let [repo6, repo7] = splitRepos(pkgv.repos); - 
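The `didMountRef` guard above is the standard skip-first-render idiom: a ref persists across renders without causing them, so the effect can tell the mount pass apart from later updates and only re-validate `version` once the user actually toggles `present_version`. Isolated as a generic helper (not part of the patch; `trigger` and `presentVersion` come from the form above):

```
import { useEffect, useRef } from "react";

// Run `effect` on every dependency change except the very first render.
function useUpdateEffect(effect, deps) {
  const didMountRef = useRef(false);

  useEffect(() => {
    if (didMountRef.current) return effect();
    didMountRef.current = true;
    // eslint-disable-next-line react-hooks/exhaustive-deps
  }, deps);
}

// e.g. useUpdateEffect(() => trigger("version"), [presentVersion])
```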
props.setFieldValue('name', pkgv.name); - props.setFieldValue('version', pkgv.version); - props.setFieldValue('repo_6', repo6); - props.setFieldValue('repo_7', repo7); - props.setFieldValue('present_version', pkgv.use_present_version); - - setDisabledButton(value === initial_version); + let [repo6, repo7] = splitRepos(pkgv.repos) + setValue('name', pkgv.name) + setValue('version', pkgv.version) + setValue('repo_6', repo6) + setValue('repo_7', repo7) + setValue('present_version', pkgv.use_present_version) + + setDisabledButton(value === initial_version) } - }); + }) } - function onSubmitHandle(values) { - let msg = `Are you sure you want to ${addview || cloneview ? 'add' : 'change'} package?`; - let title = `${addview || cloneview ? 'Add' : 'Change'} package`; + function onSubmitHandle() { + let msg = `Are you sure you want to ${addview || cloneview ? 'add' : 'change'} package?` + let title = `${addview || cloneview ? 'Add' : 'Change'} package` - setFormValues(values); - setModalMsg(msg); - setModalTitle(title); - setModalFlag('submit'); - toggleAreYouSure(); + setModalMsg(msg) + setModalTitle(title) + setModalFlag('submit') + toggleAreYouSure() } - async function onTenantSubmitHandle(values) { + async function onTenantSubmitHandle() { + let values = getValues() + try { let json = await backend.fetchData( - `/api/v2/internal/updatemetricsversions/${values.name}-${values.version}`, + `/api/v2/internal/updatemetricsversions/${values.name}-${values.version}` ) - let msgs = []; + let msgs = [] if ('updated' in json) - msgs.push(json['updated']); + msgs.push(json['updated']) if ('deleted' in json) - msgs.push(json['deleted']); + msgs.push(json['deleted']) if ('warning' in json) - msgs.push(json['warning']); + msgs.push(json['warning']) - let title = 'Update metrics'; + let title = 'Update metrics' msgs.push('ARE YOU SURE you want to update metrics?') - setModalMsg(
{msgs.map((msg, i) =>

{msg}

)}
); - setModalTitle(title); - setFormValues(values); - setModalFlag('update'); - toggleAreYouSure(); + setModalMsg(
{msgs.map((msg, i) =>

{msg}

)}
) + setModalTitle(title) + setModalFlag('update') + toggleAreYouSure() } catch(error) { NotifyError({ title: 'Error', msg: error.message }) } } function updateMetrics() { + let formValues = getValues() + const sendValues = new Object({ name: formValues.name, version: formValues.version }) updateMetricsMutation.mutate(sendValues, { onSuccess: async (data) => { - queryClient.invalidateQueries('metric'); - let json = await data.json(); + queryClient.invalidateQueries('metric') + let json = await data.json() if ('updated' in json) NotifyOk({ msg: json.updated, @@ -339,17 +275,19 @@ export const PackageComponent = (props) => { callback: () => history.push('/ui/administration/packages') }); if ('warning' in json) - NotifyWarn({msg: json.warning, title: 'Warning'}); + NotifyWarn({msg: json.warning, title: 'Warning'}) if ('deleted' in json) - NotifyWarn({msg: json.deleted, title: 'Deleted'}); + NotifyWarn({msg: json.deleted, title: 'Deleted'}) }, onError: (error) => NotifyError({ title: 'Error', msg: error.message }) }) } function doChange() { + let formValues = getValues() let repos = []; + if (formValues.repo_6) repos.push(formValues.repo_6); @@ -418,6 +356,316 @@ export const PackageComponent = (props) => { }) } + return ( + +
+ + + + + Name + + + } + /> + + + { message } + + } + /> + + + Package name. + + + + + + + Version + + disabled ? + onVersionSelect(e.value) } + options={ packageVersions.map(ver => ver.version) } + value={ field.value } + /> + : + + } + /> + + + { message } + + } + /> + + + Package version. + + + + + { + !disabled && + + + + { + setValue("present_version", e.target.checked) + }} + checked={ field.value } + /> + } + /> + + + + } + + + + + { + (!disabled && (errors.repo_6 || errors.repo_7)) && + +
+ You must provide at least one repo +
+
+ } + + + + CentOS 6 repo + + disabled ? + + : + { + setValue("repo_6", e ? e.value : '') + trigger() + }} + options={ repos6 } + value={ field.value } + /> + } + /> + + + Package is part of selected CentOS 6 repo. + + + + + + + CentOS 7 repo + + disabled ? + + : + { + setValue("repo_7", e ? e.value : '') + trigger() + }} + options={ repos7 } + value={ field.value } + /> + } + /> + + + Package is part of selected CentOS 7 repo. + + + + { + (!addview && !cloneview && probes.length > 0) && + + + Probes: +
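The `repo_6`/`repo_7` rule behind the alert above is a cross-field "at least one of two" check; in Yup this is done with a shared `.test` that inspects the sibling values through `this.parent`. A standalone sketch using the same field names (the sample values are illustrative):

```
import * as Yup from "yup";

// Shared test: valid as long as at least one of the two repo fields is filled.
const atLeastOneRepo = function () {
  // `this.parent` exposes the sibling fields of the object being validated
  return !(this.parent.repo_6 === "" && this.parent.repo_7 === "");
};

const schema = Yup.object().shape({
  repo_6: Yup.string().test("repo", "You must provide at least one repo", atLeastOneRepo),
  repo_7: Yup.string().test("repo", "You must provide at least one repo", atLeastOneRepo),
});

schema.isValid({ repo_6: "", repo_7: "" }).then(console.log);                  // false
schema.isValid({ repo_6: "argo (CentOS 6)", repo_7: "" }).then(console.log);   // true
```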
+ { + probes + .map((e, i) => {e}) + .reduce((prev, curr) => [prev, ', ', curr]) + } +
+ +
+ } +
+ { +
+ { + (!addview && !cloneview && !disabled) ? + + : +
+ } + { + disabled ? + + : + + } +
+ } +
+
+ ) +} + + +export const PackageComponent = (props) => { + const nameversion = props.match.params.nameversion; + const addview = props.addview; + const cloneview = props.cloneview; + const disabled = props.disabled; + const location = props.location; + const history = props.history; + + const backend = new Backend(); + const queryClient = useQueryClient(); + + const { data: pkg, error: errorPkg, status: statusPkg } = useQuery( + ['package', nameversion], async () => { + return await backend.fetchData(`/api/v2/internal/packages/${nameversion}`); + }, + { + enabled: !addview, + initialData: () => { + if (!addview) { + let pkgs = queryClient.getQueryData('package'); + if (pkgs) { + return pkgs.find(pkg => nameversion == `${pkg.name}-${pkg.version}`) + } + } + } + } + ) + + const { data: repos, error: errorRepos, status: statusRepos } = useQuery( + 'yumrepo', () => fetchYumRepos() + ) + + const { data: probes, error: errorProbes, status: statusProbes } = useQuery( + ['probe', 'version'], () => fetchProbeVersions(), + { enabled: !!pkg } + ) + + const { data: packageVersions, error: errorPackageVersions, status: statusPackageVersions } = useQuery( + ['package', 'versions', nameversion], async () => { + return await backend.fetchData(`/api/v2/internal/packageversions/${pkg.name}`); + }, + { enabled: !!pkg } + ) + if (statusPkg === 'loading' || statusRepos === 'loading' || statusProbes === 'loading' || statusPackageVersions === 'loading') return (); @@ -433,249 +681,42 @@ export const PackageComponent = (props) => { else if (statusPackageVersions === 'error') return (); - else if (repos) { + else if (repos && (addview || (pkg && probes && packageVersions))) { + let repos6 = new Array() + let repos7 = new Array() + let listProbes = new Array() + + repos.forEach(repo => { + if (repo.tag === 'CentOS 6') + repos6.push(`${repo.name} (${repo.tag})`) + + else if (repo.tag === 'CentOS 7') + repos7.push(`${repo.name} (${repo.tag})`) + }) + + if (probes) { + probes.forEach(probe => { + if (probe.fields.package === `${pkg.name} (${pkg.version})`) + listProbes.push(probe.fields.name) + }) + } + return ( - - onSubmitHandle(values)} - validate={ packageValidate } - enableReinitialize={ true } - > - {props => ( -
- - - - - Name - - - - - Package name. - - - - - - - Version - { - disabled ? - onVersionSelect(props, e.value) } - options={ packageVersions.map(ver => ver.version) } - value={ props.values.version } - /> - : - - } - - - - Package version. - - - - - { - !disabled && - - - props.setFieldValue('present_version', e.target.checked) } - checked={ props.values.present_version } - /> - - - - } - - - - - { - (!disabled && (props.errors.repo_6 || props.errors.repo_7)) && - -
- You must provide at least one repo! -
-
- } - - - - CentOS 6 repo - { - disabled ? - - : - props.setFieldValue('repo_6', e ? e.value : '') } - options={ repos6 } - value={ props.values.repo_6 } - /> - } - - - Package is part of selected CentOS 6 repo. - - - - - - - CentOS 7 repo - { - disabled ? - - : - props.setFieldValue('repo_7', e ? e.value : '') } - options={ repos7 } - value={ props.values.repo_7 } - /> - } - - - Package is part of selected CentOS 7 repo. - - - - { - (!addview && !cloneview && probes.length > 0) && - - - Probes: -
- { - probes - .map((e, i) => {e}) - .reduce((prev, curr) => [prev, ', ', curr]) - } -
- -
- } -
- { -
- { - (!addview && !cloneview && !disabled) ? - - : -
- } - { - disabled ? - - : - - } -
- } -
- )} -
-
+ disabled={disabled} + location={location} + history={history} + /> ) + } else - return null; -}; + return null +} diff --git a/poem/Poem/frontend/react/QueryFunctions.js b/poem/Poem/frontend/react/QueryFunctions.js index c867c2423..ae91ded27 100644 --- a/poem/Poem/frontend/react/QueryFunctions.js +++ b/poem/Poem/frontend/react/QueryFunctions.js @@ -39,13 +39,6 @@ export const fetchMetrics = async (publicView) => { } -export const fetchMetricTypes = async (publicView) => { - const backend = new Backend(); - - return await backend.fetchData(`/api/v2/internal/${publicView ? 'public_' : ''}mtypes`); -} - - export const fetchAllMetrics = async (publicView) => { const backend = new Backend(); return await backend.fetchListOfNames(`/api/v2/internal/${publicView ? 'public_' : ''}metricsall`); @@ -195,3 +188,8 @@ export const fetchTopologyGroups = async (webapi) => { export const fetchTopologyEndpoints = async (webapi) => { return await webapi.fetchReportsTopologyEndpoints() } + + +export const fetchServiceTypes = async (webapi) => { + return await webapi.fetchServiceTypes() +} diff --git a/poem/Poem/frontend/react/ServiceTypes.js b/poem/Poem/frontend/react/ServiceTypes.js index d68f49d7a..fe0768862 100644 --- a/poem/Poem/frontend/react/ServiceTypes.js +++ b/poem/Poem/frontend/react/ServiceTypes.js @@ -1,25 +1,759 @@ -import React from 'react'; -import { useQuery } from 'react-query'; -import { Backend } from './DataManager'; +import React, { useState, useEffect, useRef } from 'react'; +import { Link } from 'react-router-dom'; +import { WebApi } from './DataManager'; +import { useQuery, useQueryClient, useMutation } from 'react-query'; import { - LoadingAnim, + Button, + Col, + Form, + FormFeedback, + Input, + InputGroup, + Label, + Row, + Table, + PaginationItem, + PaginationLink, + Pagination, +} from 'reactstrap'; +import { + BaseArgoTable, BaseArgoView, + DefaultColumnFilter, ErrorComponent, Icon, - BaseArgoTable, - DefaultColumnFilter + LoadingAnim, + ModalAreYouSure, + NotifyError, + NotifyOk, + ParagraphTitle, + SearchField, } from './UIElements'; +import { + fetchUserDetails, +} from './QueryFunctions'; +import { + faSearch, + faTimes, +} from '@fortawesome/free-solid-svg-icons'; +import { FontAwesomeIcon } from '@fortawesome/react-fontawesome'; +import { useForm, Controller, useFieldArray, useWatch } from "react-hook-form"; +import { yupResolver } from '@hookform/resolvers/yup'; +import { ErrorMessage } from '@hookform/error-message'; +import * as yup from "yup"; +import _ from "lodash"; -export const ServiceTypesList = (props) => { - const publicView = props.publicView; +const validationSchema = yup.object().shape({ + name: yup.string().matches(/^[A-Za-z0-9\\.\-_]+$/g, {message: 'Name can only contain alphanumeric characters, punctuations, underscores and minuses', excludeEmptyString: false}), + description: yup.string().required('Description can not be empty.') +}).required(); - const backend = new Backend(); - const { data: serviceTypesDescriptions, error, status } = useQuery( - `${publicView ? 'public_' : ''}servicetypedesc`, async () => { - return await backend.fetchData(`/api/v2/internal/${publicView ? 
'public_' : ''}servicetypesdesc`); +const ServiceTypesListAdded = ({data, setCallback, webapi, userDetails, + serviceTypesDescriptions, ...modal}) => { + const { control, setValue, getValues, reset } = useForm({ + defaultValues: { + serviceTypes: data, } + }) + + const queryClient = useQueryClient(); + const webapiAddMutation = useMutation(async (values) => await webapi.addServiceTypes(values)); + + const { fields, remove } = useFieldArray({ + control, + name: "serviceTypes" + }) + + useEffect(() => { + setValue("serviceTypes", data) + }, [data]) + + const {setModalMsg, setModalTitle, + setModalFunc, setAreYouSureModal, + areYouSureModal} = modal + + const resetFields = () => { + reset({ + serviceTypes: Array() + }) + } + + const postServiceTypesWebApi = (data, action, title) => { + webapiAddMutation.mutate(data, { + onSuccess: (retdata) => { + queryClient.setQueryData(['servicetypes', 'webapi'], retdata); + queryClient.setQueryData(['public_servicetypes', 'webapi'], retdata); + NotifyOk({ + msg: 'Service types successfully ' + action, + title: title, + callback: () => resetFields() + }); + }, + onError: (error) => { + NotifyError({ + title: 'Web API error', + msg: error.message ? error.message : 'Web API error adding service types' + }) + } + }) + } + + const doSave = () => { + let tmpArray = [...getValues('serviceTypes'), ...serviceTypesDescriptions] + let pairs = _.orderBy(tmpArray, ['name'], ['asc']) + postServiceTypesWebApi([...pairs.map( + e => Object( + { + 'name': e.name, 'description': e.description + } + ))], + 'added', 'Add') + } + + const onSubmit = (event) => { + event.preventDefault() + setModalMsg(`Are you sure you want to add ${data.length} Service types?`) + setModalTitle('Add service types') + setModalFunc(() => doSave) + setAreYouSureModal(!areYouSureModal); + } + + if (userDetails?.is_superuser && serviceTypesDescriptions) { + return ( +
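`postServiceTypesWebApi` above writes the result of the bulk add straight into the react-query cache for both the private and public keys instead of refetching. The same flow as an isolated hook — `useAddServiceTypes` is a placeholder name, while the query keys and `webapi.addServiceTypes` come from the patch:

```
import { useMutation, useQueryClient } from "react-query";

// Sketch: on success, update both cache entries with the data the mutation
// resolves with, so list views re-render without another round trip.
const useAddServiceTypes = (webapi) => {
  const queryClient = useQueryClient();

  return useMutation((serviceTypes) => webapi.addServiceTypes(serviceTypes), {
    onSuccess: (returned) => {
      queryClient.setQueryData(["servicetypes", "webapi"], returned);
      queryClient.setQueryData(["public_servicetypes", "webapi"], returned);
    },
    onError: (error) => {
      console.error(error?.message || "Web API error adding service types");
    },
  });
};
```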
+ + + + + + + + + + + { + fields.length === 0 ? + + + + + + : + + { + fields.map((entry, index) => + + + +
+ # + + Name of service + + Description of service + + Action +
+ Empty data +
+ {index + 1} + + { entry.name } + + +