
Gerrit Configuration Examples #547

Closed
loganknecht opened this issue Oct 17, 2022 · 28 comments

Comments

@loganknecht

Hello!

This is such a cool project, and I've been having a great time testing it out to figure out how my teams are working together.

Question

I see that there is a Gerrit dashboard here.

This leads me to believe that GrimoireLab supports Gerrit with a specific index.

However I cannot for the life of me figure out how to configure it.

I have created an SSH key, and I have uploaded it to Gerrit, but I cannot figure out how to configure my projects.json and setup.cfg.

Are there any examples of how to do this? I feel bad for asking, but the documentation for this has been a bit difficult for me to find.

Gratitude

Thank you so much for this great tool! It's solving such a big problem and I really appreciate the open-source approach and architecture decisions that have been made!

@zhquan
Member

zhquan commented Oct 17, 2022

Hi @loganknecht

Thank you very much for using it.

There is a Gerrit example at https://github.com/chaoss/grimoirelab-sirmordred#gerrit-

I hope it helps you.
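Roughly, the example there boils down to fragments like the following (the hostname and username here are placeholders; note that the gerrit backend fetches data over SSH, so projects.json entries are usually bare hostnames rather than https URLs):

```json
{
    "grimoire": {
        "gerrit": [
            "gerrit.example.com"
        ]
    }
}
```

and in setup.cfg:

```ini
[gerrit]
raw_index = gerrit_raw
enriched_index = gerrit_enriched
user = your-gerrit-username
no-archive = true
```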

@loganknecht
Author

@zhquan Thank you so much for the link.

When I use it I'm still not getting the Gerrit dashboard to work.

The first piece is that I'm not seeing the Gerrit dashboard displayed as an option in the dashboards menu.
[screenshot]

And I'm not seeing it displayed in the Visualize menu either.
[screenshot]

I'm also confused about why I need to add my SSH key, but in the configurations I don't see it used anywhere.

My projects.json looks like this

{
    "lknecht": {
        "meta": {
            "title": "LKnecht Test"
        },
        "gerrit": [
            "https://[REDACTED].com/[REDACTED]",
            "https://[REDACTED].com/[REDACTED]",
            "https://[REDACTED].com/[REDACTED]",
            "https://[REDACTED].com/[REDACTED]",
            "https://[REDACTED].com/[REDACTED]e"
        ]
    }
}
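As an aside, sirmordred loads projects.json with a strict JSON parser (assuming it uses Python's json module), so small slips like a trailing comma after the last array element will make the whole file fail to load. A quick way to see the difference, sketched in Python with placeholder hostnames:

```python
import json

# Valid: no comma after the last element of the "gerrit" list
good = '{"lknecht": {"gerrit": ["gerrit.example.com"]}}'

# Invalid: strict JSON forbids a trailing comma after the list
bad = '{"lknecht": {"gerrit": ["gerrit.example.com"],}}'

json.loads(good)  # parses fine

try:
    json.loads(bad)
except json.JSONDecodeError as err:
    print("invalid JSON:", err.msg)
```

Running `python3 -m json.tool projects.json` against the real file is an easy pre-flight check before restarting Mordred.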

My setup.cfg looks like this

[general]
short_name = goSono
update = true
min_update_delay = 60
debug = false
logs_dir = /home/bitergia/logs
aliases_file = /home/bitergia/conf/aliases.json

[projects]
projects_file = /home/bitergia/conf/projects.json

[es_collection]
url = http://elasticsearch:9200

[es_enrichment]
url = http://elasticsearch:9200
autorefresh = true

[sortinghat]
host = mariadb
user = root
password =
database = demo_sh
load_orgs = true
orgs_file = /home/bitergia/conf/organizations.json
autoprofile = [github, pipermail, git]
matching = [email]
sleep_for = 100
unaffiliated_group = Unknown
affiliate = true
strict_mapping = false
reset_on_load = false
identities_file = [/home/bitergia/conf/identities.yml]
identities_format = grimoirelab

[panels]
kibiter_time_from = now-5y
kibiter_default_index = gitlab
kibiter_url = http://kibiter:5601
kibiter_version = 6.1.4-1
gitlab-issues = true
gitlab-merges = true

[phases]
collection = true
identities = true
enrichment = true
panels = true

[git]
raw_index = git_demo_raw
enriched_index = git_demo_enriched
latest-items = true
studies = [enrich_demography:git, enrich_areas_of_code:git, enrich_onion:git]


[enrich_demography:git]

[enrich_areas_of_code:git]
in_index = git_demo_raw
out_index = git-aoc_demo_enriched

[enrich_onion:git]
in_index = git-aoc_demo_enriched
out_index = git-onion_demo_enriched
contribs_field = hash

# ------------------------------------------------------------------------------
# Gerrit
# ------------------------------------------------------------------------------
[gerrit]
raw_index = gerrit_raw
enriched_index = gerrit_enriched
user = lknecht
no-archive = true
blacklist-ids = []
max-reviews = 500
studies = [enrich_demography:gerrit, enrich_onion:gerrit, enrich_demography_contribution:gerrit]

And I run my test server like this

docker run -p 127.0.0.1:5601:5601 \
           -v ~/Desktop/test/gerrit_test/config/projects.json:/projects.json \
           -v ~/Desktop/test/gerrit_test/config/setup.cfg:/setup.cfg \
           -t grimoirelab/full

Is there a specific configuration that I'm doing incorrectly?

@zhquan
Member

zhquan commented Oct 17, 2022

-t grimoirelab/full

It's better if you use docker-compose instead of grimoirelab/full, because that image is not updated.

I'm also confused about why I need to add my SSH key, but in the configurations I don't see it used anywhere.

You need to pass your .ssh directory to the Mordred container, because Mordred needs to connect to the Gerrit instance over SSH to fetch data.
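For example (hostname and username are placeholders; 29418 is Gerrit's conventional SSH port), you can verify beforehand that the key is accepted, the same way Mordred will use it:

```shell
# Mount your SSH keys into the container (path assumes the bitergia user)
docker run ... -v ~/.ssh/:/home/bitergia/.ssh ...

# Check that the Gerrit server accepts the key:
ssh -p 29418 [email protected] gerrit version
```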

You can remove the studies entry from the gerrit section, or maybe you did not paste the entire setup.cfg.

Remember that every time you modify your setup.cfg you have to restart the Mordred container.

@loganknecht
Author

@zhquan

RE: Docker Compose

I would love to use docker-compose; however, I can't because I'm on an M1 MacBook.

I have commented on a separate issue regarding this
#481

Do you have a solution or work around to that? I would love to use that approach.

@zhquan
Member

zhquan commented Oct 17, 2022

Try to use this grimoirelab/grimoirelab:latest image and add -v ~/.ssh/:/home/bitergia/.ssh \

@loganknecht
Author

loganknecht commented Oct 17, 2022

Hey @zhquan,

Thank you for all the guidance.

This does not seem to work

docker run -p 127.0.0.1:5601:5601 \
           -v ~/Desktop/test/gerrit_test/config/projects.json:/projects.json \
           -v ~/Desktop/test/gerrit_test/config/setup.cfg:/setup.cfg \
           -v ~/.ssh/:/home/bitergia/.ssh \
           -t grimoirelab/grimoirelab:latest

Unable to find image 'grimoirelab/grimoirelab:latest' locally
latest: Pulling from grimoirelab/grimoirelab
bd159e379b3b: Pull complete
de08aeb7fd50: Pull complete
ad171690c8d4: Pull complete
6117759e862e: Pull complete
d3e8b18387e2: Pull complete
276f0257cf82: Pull complete
0043a3d01313: Pull complete
707724857f5b: Pull complete
0613121680e4: Pull complete
f5a9abf92dc4: Pull complete
2b1492899e7a: Pull complete
1f75a70c8173: Pull complete
395cf2b94728: Pull complete
60342ea608e1: Pull complete
ac1d45dcbf8b: Pull complete
be39fe1daf68: Pull complete
4f4fb700ef54: Pull complete
6d3857de2060: Pull complete
7df2a5c22b61: Pull complete
Digest: sha256:06407cc703d8508385a83f3b592f83464d3828d762c6b21c83309229fcbe3179
Status: Downloaded newer image for grimoirelab/grimoirelab:latest
WARNING: The requested image's platform (linux/amd64) does not match the detected host platform (linux/arm64/v8) and no specific platform was requested
Traceback (most recent call last):
  File "/usr/local/bin/sirmordred", line 8, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.8/site-packages/sirmordred/bin/sirmordred.py", line 63, in main
    logs_dir = config_dict['general']['logs_dir']
KeyError: 'logs_dir'

I believe this to be the same issue I'm having with Docker Compose, since I'm using a Mac with an M1 chip.

@zhquan
Member

zhquan commented Oct 18, 2022

Make sure that on your setup.cfg there is a logs_dir on the general section like:

[general]
logs_dir = <path>

@jjmerchante
Contributor

Hi @loganknecht,

We recently updated the image of grimoirelab/grimoirelab and its configuration files and maybe you are mixing an old setup with a new container. Specifically, we changed the user bitergia to grimoire.

If you don't want to use docker-compose, I recommend cloning the repository and following the steps at https://github.com/chaoss/grimoirelab/tree/master/docker#run-the-image.

If you are running only the grimoirelab/grimoirelab image without docker-compose, you will need Elasticsearch, Kibiter, and a MySQL database running on their standard ports. From the docker-compose setup:

MariaDB

docker run --net=host --name mariadb \
	-e "MYSQL_ROOT_PASSWORD=" -e "MYSQL_ALLOW_EMPTY_PASSWORD=yes" \
	-d mariadb:10.6

ElasticSearch

docker run --net=host --name elasticsearch \
	-e "ES_JAVA_OPTS=-Xms2g -Xmx2g" -e "ANONYMOUS_USER=true" \
	docker.elastic.co/elasticsearch/elasticsearch-oss:6.8.6 \
	elasticsearch -Enetwork.bind_host=0.0.0.0 -Ehttp.max_content_length=2000mb

Kibiter

docker run --net=host --name kibiter \
	-e "PROJECT_NAME=Demo" -e "NODE_OPTIONS=--max-old-space-size=1000" \
	-e "ELASTICSEARCH_URL=http://localhost:9200" \
	bitergia/kibiter:community-v6.8.6-3
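Once those are up, a quick sanity check (assuming the standard ports from the commands above) is to confirm each service answers locally:

```shell
curl http://localhost:9200          # Elasticsearch should return a JSON banner
curl -I http://localhost:5601       # Kibiter should answer on its HTTP port
mysql -h 127.0.0.1 -P 3306 -u root  # MariaDB (empty root password in this setup)
```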

@loganknecht
Author

@zhquan I have given up running this on macOS and spun up an Ubuntu 22.04 (Jammy Jellyfish) build instead.

I cloned this repo and ran docker-compose from the docker-compose directory, but now I'm getting different errors where 4/5 of the containers aren't even starting

lknecht@lknecht-ubuntu:~/Desktop/chaoss_test/grimoirelab/docker-compose$ docker-compose up
Creating docker-compose_elasticsearch_1 ... done
Creating docker-compose_mariadb_1       ... done
Creating docker-compose_hatstall_1      ... done
Creating docker-compose_mordred_1       ... done
Creating docker-compose_kibiter_1       ... done
Attaching to docker-compose_mariadb_1, docker-compose_elasticsearch_1, docker-compose_hatstall_1, docker-compose_mordred_1, docker-compose_kibiter_1
elasticsearch_1  | exec /usr/local/bin/docker-entrypoint.sh: exec format error
kibiter_1        | exec /docker_entrypoint.sh: exec format error
docker-compose_elasticsearch_1 exited with code 1
hatstall_1       | exec /bin/sh: exec format error
mariadb_1        | 2022-10-21 06:19:24+00:00 [Note] [Entrypoint]: Entrypoint script for MariaDB Server 1:10.6.10+maria~ubu2004 started.
mariadb_1        | 2022-10-21 06:19:25+00:00 [Note] [Entrypoint]: Switching to dedicated user 'mysql'
mariadb_1        | 2022-10-21 06:19:25+00:00 [Note] [Entrypoint]: Entrypoint script for MariaDB Server 1:10.6.10+maria~ubu2004 started.
mariadb_1        | 2022-10-21 06:19:25+00:00 [Note] [Entrypoint]: Initializing database files
mordred_1        | exec /bin/sh: exec format error
docker-compose_hatstall_1 exited with code 1
docker-compose_mordred_1 exited with code 1
mordred_1        | exec /bin/sh: exec format error
mordred_1        | exec /bin/sh: exec format error
mariadb_1        | 
mariadb_1        | 
mariadb_1        | PLEASE REMEMBER TO SET A PASSWORD FOR THE MariaDB root USER !
mariadb_1        | To do so, start the server, then issue the following command:
mariadb_1        | 
mariadb_1        | '/usr/bin/mysql_secure_installation'
mariadb_1        | 
mariadb_1        | which will also give you the option of removing the test
mariadb_1        | databases and anonymous user created by default.  This is
mariadb_1        | strongly recommended for production servers.
mariadb_1        | 
mariadb_1        | See the MariaDB Knowledgebase at https://mariadb.com/kb
mariadb_1        | 
mariadb_1        | Please report any problems at https://mariadb.org/jira
mariadb_1        | 
mariadb_1        | The latest information about MariaDB is available at https://mariadb.org/.
mariadb_1        | 
mariadb_1        | Consider joining MariaDB's strong and vibrant community:
mariadb_1        | https://mariadb.org/get-involved/
mariadb_1        | 
mariadb_1        | 2022-10-21 06:19:26+00:00 [Note] [Entrypoint]: Database files initialized
mariadb_1        | 2022-10-21 06:19:26+00:00 [Note] [Entrypoint]: Starting temporary server
mariadb_1        | 2022-10-21 06:19:26+00:00 [Note] [Entrypoint]: Waiting for server startup
mariadb_1        | 2022-10-21  6:19:26 0 [Note] mariadbd (server 10.6.10-MariaDB-1:10.6.10+maria~ubu2004) starting as process 93 ...
mariadb_1        | 2022-10-21  6:19:26 0 [Note] InnoDB: Compressed tables use zlib 1.2.11
mariadb_1        | 2022-10-21  6:19:26 0 [Note] InnoDB: Number of pools: 1
mariadb_1        | 2022-10-21  6:19:26 0 [Note] InnoDB: Using ARMv8 crc32 + pmull instructions
mariadb_1        | 2022-10-21  6:19:26 0 [Note] mariadbd: O_TMPFILE is not supported on /tmp (disabling future attempts)
mariadb_1        | 2022-10-21  6:19:26 0 [Note] InnoDB: Using Linux native AIO
mariadb_1        | 2022-10-21  6:19:26 0 [Note] InnoDB: Initializing buffer pool, total size = 134217728, chunk size = 134217728
mariadb_1        | 2022-10-21  6:19:26 0 [Note] InnoDB: Completed initialization of buffer pool
docker-compose_kibiter_1 exited with code 1
kibiter_1        | exec /docker_entrypoint.sh: exec format error
kibiter_1        | exec /docker_entrypoint.sh: exec format error
mariadb_1        | 2022-10-21  6:19:26 0 [Note] InnoDB: 128 rollback segments are active.
mariadb_1        | 2022-10-21  6:19:26 0 [Note] InnoDB: Creating shared tablespace for temporary tables
mariadb_1        | 2022-10-21  6:19:26 0 [Note] InnoDB: Setting file './ibtmp1' size to 12 MB. Physically writing the file full; Please wait ...
mariadb_1        | 2022-10-21  6:19:26 0 [Note] InnoDB: File './ibtmp1' size is now 12 MB.
mariadb_1        | 2022-10-21  6:19:26 0 [Note] InnoDB: 10.6.10 started; log sequence number 41308; transaction id 14
mariadb_1        | 2022-10-21  6:19:26 0 [Note] Plugin 'FEEDBACK' is disabled.
mariadb_1        | 2022-10-21  6:19:26 0 [Warning] 'user' entry 'root@ff7850c2a092' ignored in --skip-name-resolve mode.
mariadb_1        | 2022-10-21  6:19:26 0 [Warning] 'proxies_priv' entry '@% root@ff7850c2a092' ignored in --skip-name-resolve mode.
mariadb_1        | 2022-10-21  6:19:26 0 [Note] mariadbd: ready for connections.
mariadb_1        | Version: '10.6.10-MariaDB-1:10.6.10+maria~ubu2004'  socket: '/run/mysqld/mysqld.sock'  port: 0  mariadb.org binary distribution
docker-compose_mordred_1 exited with code 1
mordred_1        | exec /bin/sh: exec format error
mordred_1        | exec /bin/sh: exec format error
mordred_1        | exec /bin/sh: exec format error
docker-compose_kibiter_1 exited with code 1
kibiter_1        | exec /docker_entrypoint.sh: exec format error
kibiter_1        | exec /docker_entrypoint.sh: exec format error
kibiter_1        | exec /docker_entrypoint.sh: exec format error
mariadb_1        | 2022-10-21 06:19:27+00:00 [Note] [Entrypoint]: Temporary server started.
docker-compose_mordred_1 exited with code 1
mordred_1        | exec /bin/sh: exec format error
mordred_1        | exec /bin/sh: exec format error
mordred_1        | exec /bin/sh: exec format error
mordred_1        | exec /bin/sh: exec format error
docker-compose_kibiter_1 exited with code 1
kibiter_1        | exec /docker_entrypoint.sh: exec format error
kibiter_1        | exec /docker_entrypoint.sh: exec format error
kibiter_1        | exec /docker_entrypoint.sh: exec format error
kibiter_1        | exec /docker_entrypoint.sh: exec format error
mariadb_1        | 2022-10-21 06:19:28+00:00 [Note] [Entrypoint]: Securing system users (equivalent to running mysql_secure_installation)
mariadb_1        | 
mariadb_1        | 2022-10-21 06:19:28+00:00 [Note] [Entrypoint]: Stopping temporary server
mariadb_1        | 2022-10-21  6:19:28 0 [Note] mariadbd (initiated by: unknown): Normal shutdown
mariadb_1        | 2022-10-21  6:19:28 0 [Note] InnoDB: FTS optimize thread exiting.
docker-compose_mordred_1 exited with code 1
mordred_1        | exec /bin/sh: exec format error
mordred_1        | exec /bin/sh: exec format error
mordred_1        | exec /bin/sh: exec format error
mordred_1        | exec /bin/sh: exec format error
mordred_1        | exec /bin/sh: exec format error
mariadb_1        | 2022-10-21  6:19:28 0 [Note] InnoDB: Starting shutdown...
mariadb_1        | 2022-10-21  6:19:28 0 [Note] InnoDB: Dumping buffer pool(s) to /var/lib/mysql/ib_buffer_pool
mariadb_1        | 2022-10-21  6:19:28 0 [Note] InnoDB: Buffer pool(s) dump completed at 221021  6:19:28
mariadb_1        | 2022-10-21  6:19:29 0 [Note] InnoDB: Removed temporary tablespace data file: "./ibtmp1"
mariadb_1        | 2022-10-21  6:19:29 0 [Note] InnoDB: Shutdown completed; log sequence number 42282; transaction id 15
mariadb_1        | 2022-10-21  6:19:29 0 [Note] mariadbd: Shutdown complete
mariadb_1        | 
mariadb_1        | 2022-10-21 06:19:29+00:00 [Note] [Entrypoint]: Temporary server stopped
mariadb_1        | 
mariadb_1        | 2022-10-21 06:19:29+00:00 [Note] [Entrypoint]: MariaDB init process done. Ready for start up.
mariadb_1        | 
docker-compose_kibiter_1 exited with code 1
kibiter_1        | exec /docker_entrypoint.sh: exec format error
kibiter_1        | exec /docker_entrypoint.sh: exec format error
kibiter_1        | exec /docker_entrypoint.sh: exec format error
kibiter_1        | exec /docker_entrypoint.sh: exec format error
kibiter_1        | exec /docker_entrypoint.sh: exec format error
mariadb_1        | 2022-10-21  6:19:29 0 [Note] mariadbd (server 10.6.10-MariaDB-1:10.6.10+maria~ubu2004) starting as process 1 ...
mariadb_1        | 2022-10-21  6:19:29 0 [Note] InnoDB: Compressed tables use zlib 1.2.11
mariadb_1        | 2022-10-21  6:19:29 0 [Note] InnoDB: Number of pools: 1
mariadb_1        | 2022-10-21  6:19:29 0 [Note] InnoDB: Using ARMv8 crc32 + pmull instructions
mariadb_1        | 2022-10-21  6:19:29 0 [Note] mariadbd: O_TMPFILE is not supported on /tmp (disabling future attempts)
mariadb_1        | 2022-10-21  6:19:29 0 [Note] InnoDB: Using Linux native AIO
mariadb_1        | 2022-10-21  6:19:29 0 [Note] InnoDB: Initializing buffer pool, total size = 134217728, chunk size = 134217728
mariadb_1        | 2022-10-21  6:19:29 0 [Note] InnoDB: Completed initialization of buffer pool
mariadb_1        | 2022-10-21  6:19:29 0 [Note] InnoDB: 128 rollback segments are active.
mariadb_1        | 2022-10-21  6:19:29 0 [Note] InnoDB: Creating shared tablespace for temporary tables
mariadb_1        | 2022-10-21  6:19:29 0 [Note] InnoDB: Setting file './ibtmp1' size to 12 MB. Physically writing the file full; Please wait ...
mariadb_1        | 2022-10-21  6:19:29 0 [Note] InnoDB: File './ibtmp1' size is now 12 MB.
mariadb_1        | 2022-10-21  6:19:29 0 [Note] InnoDB: 10.6.10 started; log sequence number 42282; transaction id 14
mariadb_1        | 2022-10-21  6:19:29 0 [Note] InnoDB: Loading buffer pool(s) from /var/lib/mysql/ib_buffer_pool
mariadb_1        | 2022-10-21  6:19:29 0 [Note] Plugin 'FEEDBACK' is disabled.
mariadb_1        | 2022-10-21  6:19:29 0 [Note] InnoDB: Buffer pool(s) load completed at 221021  6:19:29
mariadb_1        | 2022-10-21  6:19:29 0 [Warning] You need to use --log-bin to make --expire-logs-days or --binlog-expire-logs-seconds work.
mariadb_1        | 2022-10-21  6:19:29 0 [Note] Server socket created on IP: '0.0.0.0'.
mariadb_1        | 2022-10-21  6:19:29 0 [Note] Server socket created on IP: '::'.
mariadb_1        | 2022-10-21  6:19:29 0 [Note] mariadbd: ready for connections.
mariadb_1        | Version: '10.6.10-MariaDB-1:10.6.10+maria~ubu2004'  socket: '/run/mysqld/mysqld.sock'  port: 3306  mariadb.org binary distribution
docker-compose_mordred_1 exited with code 1
docker-compose_kibiter_1 exited with code 1

If I run docker logs on each of the containers, this is what I get:

lknecht@lknecht-ubuntu:~/Desktop/chaoss_test/grimoirelab$ docker ps -a
CONTAINER ID   IMAGE                                                     COMMAND                  CREATED          STATUS                      PORTS      NAMES
e7cdef1e526d   bitergia/kibiter:community-v6.8.6-3                       "/docker_entrypoint.…"   10 minutes ago   Exited (1) 10 minutes ago              docker-compose_kibiter_1
2843073569fe   grimoirelab/grimoirelab:latest                            "/bin/sh -c ${DEPLOY…"   10 minutes ago   Exited (1) 10 minutes ago              docker-compose_mordred_1
10c406240f53   grimoirelab/hatstall:latest                               "/bin/sh -c ${DEPLOY…"   10 minutes ago   Exited (1) 10 minutes ago              docker-compose_hatstall_1
ff7850c2a092   mariadb:10.6                                              "docker-entrypoint.s…"   10 minutes ago   Up 10 minutes               3306/tcp   docker-compose_mariadb_1
4f42cc041297   docker.elastic.co/elasticsearch/elasticsearch-oss:6.8.6   "/usr/local/bin/dock…"   10 minutes ago   Exited (1) 10 minutes ago              docker-compose_elasticsearch_1
lknecht@lknecht-ubuntu:~/Desktop/chaoss_test/grimoirelab$ docker logs e7cdef1e526d
exec /docker_entrypoint.sh: exec format error
exec /docker_entrypoint.sh: exec format error
exec /docker_entrypoint.sh: exec format error
exec /docker_entrypoint.sh: exec format error
exec /docker_entrypoint.sh: exec format error
exec /docker_entrypoint.sh: exec format error
lknecht@lknecht-ubuntu:~/Desktop/chaoss_test/grimoirelab$ docker logs 2843073569fe
exec /bin/sh: exec format error
exec /bin/sh: exec format error
exec /bin/sh: exec format error
exec /bin/sh: exec format error
exec /bin/sh: exec format error
exec /bin/sh: exec format error
lknecht@lknecht-ubuntu:~/Desktop/chaoss_test/grimoirelab$ docker logs 10c406240f53
exec /bin/sh: exec format error
lknecht@lknecht-ubuntu:~/Desktop/chaoss_test/grimoirelab$ docker logs 4f42cc041297
exec /usr/local/bin/docker-entrypoint.sh: exec format error

Am I doing something incorrectly in the Linux runtime?
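The `exec format error` messages usually mean the container binaries were built for a different CPU architecture than the host. One way to confirm (using one of the images above as an example):

```shell
uname -m    # host architecture, e.g. x86_64 or aarch64

# Architecture the image was built for:
docker image inspect --format '{{.Architecture}}' grimoirelab/grimoirelab:latest
```

If the two don't match and no multi-arch variant of the image exists, the containers will fail exactly like this.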

@sduenas
Member

sduenas commented Oct 21, 2022

This might be a problem with the current docker image. It's only built for AMD64 and not ARM processors. We'll generate a new version of the image to check whether that fixes the problem.

@loganknecht
Author

loganknecht commented Nov 22, 2022

Hello!

I believe I'm really close to the finish line on this.

I am running an emulated Ubuntu AMD64 image under QEMU/UTM and it appears to spin up correctly; however, I'm still getting the Elasticsearch error below.

The QEMU/UTM configuration is:

  • 4GB Memory
  • 6 CPU
  • 128GB Storage

elasticsearch_1  | #
elasticsearch_1  | # A fatal error has been detected by the Java Runtime Environment:
elasticsearch_1  | #
elasticsearch_1  | #  SIGSEGV (0xb) at pc=0x00007fd84ec6d074, pid=1, tid=71
elasticsearch_1  | #
elasticsearch_1  | # JRE version: OpenJDK Runtime Environment (13.0.1+9) (build 13.0.1+9)
elasticsearch_1  | # Java VM: OpenJDK 64-Bit Server VM (13.0.1+9, mixed mode, sharing, tiered, compressed oops, concurrent mark sweep gc, linux-amd64)
elasticsearch_1  | # Problematic frame:
elasticsearch_1  | # V  [libjvm.so+0x767074]  frame::frame(long*, long*, long*, unsigned char*)+0xc4
elasticsearch_1  | #
elasticsearch_1  | # Core dump will be written. Default location: Core dumps may be processed with "/usr/share/apport/apport -p%p -s%s -c%c -d%d -P%P -u%u -g%g -- %E" (or dumping to /usr/share/elasticsearch/core.1)
elasticsearch_1  | #
elasticsearch_1  | # An error report file with more information is saved as:
elasticsearch_1  | # logs/hs_err_pid1.log
elasticsearch_1  | Compiled method (c1)  690885 1562       3       sun.nio.fs.UnixPath::compareTo (92 bytes)
elasticsearch_1  |  total in heap  [0x00007fd830fe8410,0x00007fd830fe8db8] = 2472
elasticsearch_1  |  relocation     [0x00007fd830fe8570,0x00007fd830fe85d8] = 104
elasticsearch_1  |  main code      [0x00007fd830fe85e0,0x00007fd830fe8b20] = 1344
elasticsearch_1  |  stub code      [0x00007fd830fe8b20,0x00007fd830fe8bb0] = 144
elasticsearch_1  |  metadata       [0x00007fd830fe8bb0,0x00007fd830fe8bc8] = 24
elasticsearch_1  |  scopes data    [0x00007fd830fe8bc8,0x00007fd830fe8c78] = 176
elasticsearch_1  |  scopes pcs     [0x00007fd830fe8c78,0x00007fd830fe8d78] = 256
elasticsearch_1  |  dependencies   [0x00007fd830fe8d78,0x00007fd830fe8d80] = 8
elasticsearch_1  |  nul chk table  [0x00007fd830fe8d80,0x00007fd830fe8db8] = 56
elasticsearch_1  | Compiled method (c1)  690886 1562       3       sun.nio.fs.UnixPath::compareTo (92 bytes)
elasticsearch_1  |  total in heap  [0x00007fd830fe8410,0x00007fd830fe8db8] = 2472
elasticsearch_1  |  relocation     [0x00007fd830fe8570,0x00007fd830fe85d8] = 104
elasticsearch_1  |  main code      [0x00007fd830fe85e0,0x00007fd830fe8b20] = 1344
elasticsearch_1  |  stub code      [0x00007fd830fe8b20,0x00007fd830fe8bb0] = 144
elasticsearch_1  |  metadata       [0x00007fd830fe8bb0,0x00007fd830fe8bc8] = 24
elasticsearch_1  |  scopes data    [0x00007fd830fe8bc8,0x00007fd830fe8c78] = 176
elasticsearch_1  |  scopes pcs     [0x00007fd830fe8c78,0x00007fd830fe8d78] = 256
elasticsearch_1  |  dependencies   [0x00007fd830fe8d78,0x00007fd830fe8d80] = 8
elasticsearch_1  |  nul chk table  [0x00007fd830fe8d80,0x00007fd830fe8db8] = 56
elasticsearch_1  | #
elasticsearch_1  | # If you would like to submit a bug report, please visit:
elasticsearch_1  | #   https://github.com/AdoptOpenJDK/openjdk-build/issues
elasticsearch_1  | #
elasticsearch_1  | 
elasticsearch_1  | [error occurred during error reporting (), id 0xb, SIGSEGV (0xb) at pc=0x00007fd84fc07b77]
elasticsearch_1  | 

I was able to get into the container and view the log file it mentioned above, hs_err_pid1.log:

[root@828f4de484b4 elasticsearch]# cat logs/hs_err_pid1.log 
#
# A fatal error has been detected by the Java Runtime Environment:
#
#  SIGSEGV (0xb) at pc=0x00007fd84ec6d074, pid=1, tid=71
#
# JRE version: OpenJDK Runtime Environment (13.0.1+9) (build 13.0.1+9)
# Java VM: OpenJDK 64-Bit Server VM (13.0.1+9, mixed mode, sharing, tiered, compressed oops, concurrent mark sweep gc, linux-amd64)
# Problematic frame:
# V  [libjvm.so+0x767074]  frame::frame(long*, long*, long*, unsigned char*)+0xc4
#
# Core dump will be written. Default location: Core dumps may be processed with "/usr/share/apport/apport -p%p -s%s -c%c -d%d -P%P -u%u -g%g -- %E" (or dumping to /usr/share/elasticsearch/core.1)
#
# If you would like to submit a bug report, please visit:
#   https://github.com/AdoptOpenJDK/openjdk-build/issues
#

---------------  S U M M A R Y ------------

Command Line: -Xms1g -Xmx1g -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=75 -XX:+UseCMSInitiatingOccupancyOnly -Des.networkaddress.cache.ttl=60 -Des.networkaddress.cache.negative.ttl=10 -XX:+AlwaysPreTouch -Xss1m -Djava.awt.headless=true -Dfile.encoding=UTF-8 -Djna.nosys=true -XX:-OmitStackTraceInFastThrow -Dio.netty.noUnsafe=true -Dio.netty.noKeySetOptimization=true -Dio.netty.recycler.maxCapacityPerThread=0 -Dlog4j.shutdownHookEnabled=false -Dlog4j2.disable.jmx=true -Djava.io.tmpdir=/tmp/elasticsearch-5148610102302248278 -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=data -XX:ErrorFile=logs/hs_err_pid%p.log -Xlog:gc*,gc+age=trace,safepoint:file=logs/gc.log:utctime,pid,tags:filecount=32,filesize=64m -Djava.locale.providers=COMPAT -XX:UseAVX=2 -Xms2g -Xmx2g -Des.path.home=/usr/share/elasticsearch -Des.path.conf=/usr/share/elasticsearch/config -Des.distribution.flavor=oss -Des.distribution.type=docker org.elasticsearch.bootstrap.Elasticsearch -Enetwork.bind_host=0.0.0.0 -Ehttp.max_content_length=2000mb

Host: QEMU Virtual CPU version 2.5+, 6 cores, 3G, CentOS Linux release 7.7.1908 (Core)
Time: Tue Nov 22 22:41:20 2022 UTC elapsed time: 690 seconds (0d 0h 11m 30s)

---------------  T H R E A D  ---------------

Current thread (0x00007fd848019000):  JavaThread "main" [_thread_in_vm, id=71, stack(0x00007fd8508fc000,0x00007fd8509fd000)]

Stack: [0x00007fd8508fc000,0x00007fd8509fd000],  sp=0x00007fd8509f85d0,  free space=1009k
Native frames: (J=compiled Java code, A=aot compiled Java code, j=interpreted, Vv=VM code, C=native code)
V  [libjvm.so+0x767074]  frame::frame(long*, long*, long*, unsigned char*)+0xc4
V  [libjvm.so+0x7666a5]  frame::sender(RegisterMap*) const+0x155
V  [libjvm.so+0x8bdeb1]  java_lang_Throwable::fill_in_stack_trace(Handle, methodHandle const&, Thread*)+0x5d1
V  [libjvm.so+0x8be303]  java_lang_Throwable::fill_in_stack_trace(Handle, methodHandle const&)+0x63
V  [libjvm.so+0x95defd]  JVM_FillInStackTrace+0x6d
C  [libjava.so+0x15a21]  Java_java_lang_Throwable_fillInStackTrace+0x11
j  java.lang.Throwable.fillInStackTrace(I)Ljava/lang/Throwable;+0 [email protected]
j  java.lang.Throwable.fillInStackTrace()Ljava/lang/Throwable;+16 [email protected]
j  java.lang.Throwable.<init>(Ljava/lang/String;)V+24 [email protected]
j  java.lang.Exception.<init>(Ljava/lang/String;)V+2 [email protected]
j  java.lang.RuntimeException.<init>(Ljava/lang/String;)V+2 [email protected]
j  java.lang.ClassCastException.<init>(Ljava/lang/String;)V+2 [email protected]
v  ~StubRoutines::call_stub
V  [libjvm.so+0x8b1569]  JavaCalls::call_helper(JavaValue*, methodHandle const&, JavaCallArguments*, Thread*)+0x3c9
V  [libjvm.so+0x8afb6d]  JavaCalls::call_special(JavaValue*, Klass*, Symbol*, Symbol*, JavaCallArguments*, Thread*)+0x10d
V  [libjvm.so+0x8b0b46]  JavaCalls::construct_new_instance(InstanceKlass*, Symbol*, JavaCallArguments*, Thread*)+0xa6
V  [libjvm.so+0x757095]  Exceptions::new_exception(Thread*, Symbol*, char const*, Handle, Handle, Handle, Exceptions::ExceptionMsgToUtf8Mode)+0x2e5
V  [libjvm.so+0x757115]  Exceptions::new_exception(Thread*, Symbol*, char const*, Exceptions::ExceptionMsgToUtf8Mode)+0x65
V  [libjvm.so+0xd5ee33]  SharedRuntime::throw_and_post_jvmti_exception(JavaThread*, Symbol*, char const*)+0x13
V  [libjvm.so+0x5517fe]  Runtime1::throw_class_cast_exception(JavaThread*, oopDesc*)+0x6e
v  ~RuntimeStub::throw_class_cast_exception Runtime1 stub
J 1562 c1 sun.nio.fs.UnixPath.compareTo(Ljava/nio/file/Path;)I [email protected] (92 bytes) @ 0x00007fd830fe8a4c [0x00007fd830fe85e0+0x000000000000046c]

[error occurred during error reporting (printing native stack), id 0xb, SIGSEGV (0xb) at pc=0x00007fd84ec6d074]

Java frames: (J=compiled Java code, j=interpreted, Vv=VM code)
j  java.lang.Throwable.fillInStackTrace(I)Ljava/lang/Throwable;+0 [email protected]
j  java.lang.Throwable.fillInStackTrace()Ljava/lang/Throwable;+16 [email protected]
j  java.lang.Throwable.<init>(Ljava/lang/String;)V+24 [email protected]
j  java.lang.Exception.<init>(Ljava/lang/String;)V+2 [email protected]
j  java.lang.RuntimeException.<init>(Ljava/lang/String;)V+2 [email protected]
j  java.lang.ClassCastException.<init>(Ljava/lang/String;)V+2 [email protected]
v  ~StubRoutines::call_stub
v  ~RuntimeStub::throw_class_cast_exception Runtime1 stub
J 1562 c1 sun.nio.fs.UnixPath.compareTo(Ljava/nio/file/Path;)I [email protected] (92 bytes) @ 0x00007fd830fe8a4c [0x00007fd830fe85e0+0x000000000000046c]

[error occurred during error reporting (printing Java stack), id 0xb, SIGSEGV (0xb) at pc=0x00007fd84ec6d074]


siginfo: si_signo: 11 (SIGSEGV), si_code: 1 (SEGV_MAPERR), si_addr: 0x0000000000000000

Register to memory mapping:

RAX=0x0 is NULL
RBX=0x00007fd8509f8690 is pointing into the stack for thread: 0x00007fd848019000
RCX=0x0000000001893eb8 points into unknown readable memory: 31 00 00 00 00 00 00 00
RDX=0x00007fd8480375e0 points into unknown readable memory: a8 53 a5 4f d8 7f 00 00
RSP=0x00007fd8509f85d0 is pointing into the stack for thread: 0x00007fd848019000
RBP=0x00007fd8509f85e0 is pointing into the stack for thread: 0x00007fd848019000
RSI=0x00007fd8509fa738 is pointing into the stack for thread: 0x00007fd848019000
RDI=0x00007fd8509fa7a0 is pointing into the stack for thread: 0x00007fd848019000
R8 =0x00007fd8509fa7a0 is pointing into the stack for thread: 0x00007fd848019000
R9 =0x0000000000000002 is an unknown value
R10=0x00007fd8307ae96f is at code_begin+879 in an Interpreter codelet
method entry point (kind = native)  [0x00007fd8307ae600, 0x00007fd8307aefc0]  2496 bytes
R11=0x00007fd8509f9d48 is pointing into the stack for thread: 0x00007fd848019000
R12=0x0 is NULL
R13=0x00007fd830fe8a4c is at entry_point+1132 in (nmethod*)0x00007fd830fe8410
R14=0x0 is NULL
R15=0x00007fd830fe8410 is at entry_point+-464 in (nmethod*)0x00007fd830fe8410


Registers:
RAX=0x0000000000000000, RBX=0x00007fd8509f8690, RCX=0x0000000001893eb8, RDX=0x00007fd8480375e0
RSP=0x00007fd8509f85d0, RBP=0x00007fd8509f85e0, RSI=0x00007fd8509fa738, RDI=0x00007fd8509fa7a0
R8 =0x00007fd8509fa7a0, R9 =0x0000000000000002, R10=0x00007fd8307ae96f, R11=0x00007fd8509f9d48
R12=0x0000000000000000, R13=0x00007fd830fe8a4c, R14=0x0000000000000000, R15=0x00007fd830fe8410
RIP=0x00007fd84ec6d074, EFLAGS=0x0000000000000246, CSGSFS=0x002b000000000033, ERR=0x0000000000000004
  TRAPNO=0x000000000000000e

Top of Stack: (sp=0x00007fd8509f85d0)
0x00007fd8509f85d0:   00007fd8509f8690 00007fd8509fa738
0x00007fd8509f85e0:   00007fd8509f8630 00007fd84ec6c6a5
0x00007fd8509f85f0:   00007fd8509f8610 00007fd84eb06d6a
0x00007fd8509f8600:   00007fd8509f8630 0000000000000001 

Instructions: (pc=0x00007fd84ec6d074)
0x00007fd84ec6cf74:   48 8b 40 f8 48 89 47 08 c3 90 66 90 48 8b 07 55
0x00007fd84ec6cf84:   48 89 e5 48 8b 40 f8 5d 48 89 47 08 c3 66 2e 0f
0x00007fd84ec6cf94:   1f 84 00 00 00 00 00 0f 1f 44 00 00 55 48 89 e5
0x00007fd84ec6cfa4:   e8 d7 2a 4d 00 5d 48 8b 00 c3 66 90 55 48 89 e5
0x00007fd84ec6cfb4:   41 54 53 48 89 fb 48 89 37 48 89 57 28 48 89 4f
0x00007fd84ec6cfc4:   20 4c 89 47 08 4c 89 c7 e8 1f 9d e9 ff 48 85 c0
0x00007fd84ec6cfd4:   49 89 c4 48 89 43 10 0f 84 93 00 00 00 48 8b 00
0x00007fd84ec6cfe4:   4c 89 e7 ff 50 60 84 c0 0f 84 7e 00 00 00 48 8b
0x00007fd84ec6cff4:   43 08 49 8b 94 24 80 00 00 00 48 39 d0 74 12 41
0x00007fd84ec6d004:   83 7c 24 08 03 74 45 49 3b 84 24 88 00 00 00 75
0x00007fd84ec6d014:   5b 49 8b 04 24 48 8d 15 80 ff ff ff 48 89 de 4c
0x00007fd84ec6d024:   89 e7 48 8b 80 b8 01 00 00 48 39 d0 75 36 e8 49
0x00007fd84ec6d034:   2a 4d 00 48 8b 00 48 85 c0 74 31 48 89 43 08 c7
0x00007fd84ec6d044:   43 18 01 00 00 00 5b 41 5c 5d c3 90 48 83 c2 05
0x00007fd84ec6d054:   48 39 d0 74 bc 49 3b 84 24 88 00 00 00 74 b2 eb
0x00007fd84ec6d064:   0b 0f 1f 00 ff d0 eb ce 0f 1f 40 00 4c 8b 63 10
0x00007fd84ec6d074:   49 8b 04 24 4c 89 e7 ff 50 20 84 c0 0f 95 c0 0f
0x00007fd84ec6d084:   b6 c0 89 43 18 5b 41 5c 5d c3 66 90 55 48 8b 4f
0x00007fd84ec6d094:   08 48 8d 15 eb 8d 95 00 4c 8d 05 7d ce 94 00 48
0x00007fd84ec6d0a4:   89 f0 48 8d 35 8b ce 94 00 48 89 e5 f6 c1 02 5d
0x00007fd84ec6d0b4:   4c 0f 44 c2 48 8b 57 10 48 83 e1 fc 48 89 c7 31
0x00007fd84ec6d0c4:   c0 e9 66 b4 52 00 66 0f 1f 44 00 00 48 8b 05 29
0x00007fd84ec6d0d4:   dc f0 00 55 c7 47 10 00 00 00 00 48 89 e5 48 89
0x00007fd84ec6d0e4:   47 08 5d c3 0f 1f 84 00 00 00 00 00 48 8b 07 55
0x00007fd84ec6d0f4:   31 f6 48 89 e5 41 54 49 89 fc 53 48 8b 5f 08 ff
0x00007fd84ec6d104:   10 48 39 1d f4 db f0 00 b8 00 00 00 00 49 c7 44
0x00007fd84ec6d114:   24 08 00 00 00 00 48 0f 44 d8 48 89 d8 5b 41 5c
0x00007fd84ec6d124:   5d c3 66 2e 0f 1f 84 00 00 00 00 00 48 8b 05 c9
0x00007fd84ec6d134:   db f0 00 55 c7 47 10 00 00 00 00 48 c7 47 30 00
0x00007fd84ec6d144:   00 00 00 48 89 e5 48 89 47 08 5d c3 55 48 8b 3d
0x00007fd84ec6d154:   b0 db f0 00 0f b6 d2 48 89 e5 5d e9 ac 89 01 00
0x00007fd84ec6d164:   66 90 66 2e 0f 1f 84 00 00 00 00 00 55 48 8b 3d 


Stack slot to memory mapping:
stack at sp + 0 slots: 0x00007fd8509f8690 is pointing into the stack for thread: 0x00007fd848019000
stack at sp + 1 slots: 0x00007fd8509fa738 is pointing into the stack for thread: 0x00007fd848019000
stack at sp + 2 slots: 0x00007fd8509f8630 is pointing into the stack for thread: 0x00007fd848019000
stack at sp + 3 slots: 0x00007fd84ec6c6a5: <offset 0x00000000007666a5> in /opt/jdk-13.0.1+9/lib/server/libjvm.so at 0x00007fd84e506000
stack at sp + 4 slots: 0x00007fd8509f8610 is pointing into the stack for thread: 0x00007fd848019000
stack at sp + 5 slots: 0x00007fd84eb06d6a: <offset 0x0000000000600d6a> in /opt/jdk-13.0.1+9/lib/server/libjvm.so at 0x00007fd84e506000
stack at sp + 6 slots: 0x00007fd8509f8630 is pointing into the stack for thread: 0x00007fd848019000
stack at sp + 7 slots: 0x0000000000000001 is an unknown value


---------------  P R O C E S S  ---------------

Threads class SMR info:
_java_thread_list=0x00007fd8484bede0, length=10, elements={
0x00007fd848019000, 0x00007fd8480c4000, 0x00007fd8480c6800, 0x00007fd8480d9000,
0x00007fd8480db800, 0x00007fd8480dd800, 0x00007fd8480df800, 0x00007fd848122800,
0x00007fd84812b000, 0x00007fd848357800
}

Java Threads: ( => current thread )
=>0x00007fd848019000 JavaThread "main" [_thread_in_vm, id=71, stack(0x00007fd8508fc000,0x00007fd8509fd000)]
  0x00007fd8480c4000 JavaThread "Reference Handler" daemon [_thread_blocked, id=76, stack(0x00007fd8300a3000,0x00007fd8301a4000)]
  0x00007fd8480c6800 JavaThread "Finalizer" daemon [_thread_blocked, id=77, stack(0x00007fd8286cb000,0x00007fd8287cc000)]
  0x00007fd8480d9000 JavaThread "Signal Dispatcher" daemon [_thread_blocked, id=78, stack(0x00007fd8285ca000,0x00007fd8286cb000)]
  0x00007fd8480db800 JavaThread "C2 CompilerThread0" daemon [_thread_blocked, id=79, stack(0x00007fd8284c9000,0x00007fd8285ca000)]
  0x00007fd8480dd800 JavaThread "C1 CompilerThread0" daemon [_thread_blocked, id=80, stack(0x00007fd8283c8000,0x00007fd8284c9000)]
  0x00007fd8480df800 JavaThread "Sweeper thread" daemon [_thread_blocked, id=81, stack(0x00007fd8282c7000,0x00007fd8283c8000)]
  0x00007fd848122800 JavaThread "Service Thread" daemon [_thread_blocked, id=82, stack(0x00007fd8281c6000,0x00007fd8282c7000)]
  0x00007fd84812b000 JavaThread "Common-Cleaner" daemon [_thread_blocked, id=84, stack(0x00007fd8236ff000,0x00007fd823800000)]
  0x00007fd848357800 JavaThread "process reaper" daemon [_thread_blocked, id=86, stack(0x00007fd830050000,0x00007fd830072000)]

Other Threads:
  0x00007fd8480c1000 VMThread "VM Thread" [stack: 0x00007fd8301a6000,0x00007fd8302a6000] [id=75] _threads_hazard_ptr=0x00007fd8484bede0
  0x00007fd848124800 WatcherThread [stack: 0x00007fd8280c6000,0x00007fd8281c6000] [id=83]
  0x00007fd848099000 GCTaskThread "GC Thread#0" [stack: 0x00007fd84c5a3000,0x00007fd84c6a3000] [id=72]
  0x00007fd84809f000 ConcurrentGCThread "CMS Main Thread" [stack: 0x00007fd83049d000,0x00007fd83059d000] [id=74]
  0x00007fd84809b800 GCTaskThread "CMS Thread#0" [stack: 0x00007fd84c4a1000,0x00007fd84c5a1000] [id=73]

Threads with active compile tasks:
C2 CompilerThread0   690917 1708   !   4       sun.nio.fs.UnixPath::initOffsets (189 bytes)

VM state:synchronizing (normal execution)

VM Mutex/Monitor currently owned by a thread:  ([mutex/lock_event])
[0x00007fd848014ed0] Threads_lock - owner thread: 0x00007fd8480c1000

Heap address: 0x0000000080000000, size: 2048 MB, Compressed Oops mode: 32-bit
Narrow klass base: 0x0000000800000000, Narrow klass shift: 3
Compressed class space size: 1073741824 Address: 0x0000000800ae7000

Heap:
 par new generation   total 460096K, used 90027K [0x0000000080000000, 0x000000009f330000, 0x000000009f330000)
  eden space 409024K,  22% used [0x0000000080000000, 0x00000000857eaec8, 0x0000000098f70000)
  from space 51072K,   0% used [0x0000000098f70000, 0x0000000098f70000, 0x000000009c150000)
  to   space 51072K,   0% used [0x000000009c150000, 0x000000009c150000, 0x000000009f330000)
 concurrent mark-sweep generation total 1585984K, used 0K [0x000000009f330000, 0x0000000100000000, 0x0000000100000000)
 Metaspace       used 13147K, capacity 13598K, committed 13824K, reserved 1060864K
  class space    used 1581K, capacity 1770K, committed 1792K, reserved 1048576K

Card table byte_map: [0x00007fd84cfb0000,0x00007fd84d3b1000] _byte_map_base: 0x00007fd84cbb0000

Marking Bits: (CMSBitMap*) 0x00007fd84809a7c0
 Bits: [0x00007fd82a7cc000, 0x00007fd82bfff400)

Mod Union Table: (CMSBitMap*) 0x00007fd84809a860
 Bits: [0x00007fd850887000, 0x00007fd8508e7cd0)

Polling page: 0x00007fd850a02000

Metaspace:

Usage:
  Non-class:     11.55 MB capacity,    11.30 MB ( 98%) used,   222.99 KB (  2%) free+waste,    38.75 KB ( <1%) overhead. 
      Class:      1.73 MB capacity,     1.54 MB ( 89%) used,   166.35 KB (  9%) free+waste,    22.00 KB (  1%) overhead. 
       Both:     13.28 MB capacity,    12.84 MB ( 97%) used,   389.34 KB (  3%) free+waste,    60.75 KB ( <1%) overhead. 

Virtual space:
  Non-class space:       12.00 MB reserved,      11.75 MB ( 98%) committed 
      Class space:        1.00 GB reserved,       1.75 MB ( <1%) committed 
             Both:        1.01 GB reserved,      13.50 MB (  1%) committed 

Chunk freelists:
   Non-Class:  0 bytes
       Class:  0 bytes
        Both:  0 bytes

MaxMetaspaceSize: unlimited
CompressedClassSpaceSize: 1.00 GB

CodeHeap 'non-profiled nmethods': size=120032Kb used=695Kb max_used=695Kb free=119336Kb
 bounds [0x00007fd838267000, 0x00007fd8384d7000, 0x00007fd83f79f000]
CodeHeap 'profiled nmethods': size=120032Kb used=3471Kb max_used=3471Kb free=116560Kb
 bounds [0x00007fd830d2f000, 0x00007fd83109f000, 0x00007fd838267000]
CodeHeap 'non-nmethods': size=5696Kb used=1232Kb max_used=1256Kb free=4463Kb
 bounds [0x00007fd83079f000, 0x00007fd830a0f000, 0x00007fd830d2f000]
 total_blobs=2674 nmethods=1733 adapters=423
 compilation: enabled
              stopped_count=0, restarted_count=0
 full_count=0

Compilation events (20 events):
Event: 690.688 Thread 0x00007fd8480dd800 1741       3       jdk.internal.org.objectweb.asm.AnnotationVisitor::<init> (7 bytes)
Event: 690.690 Thread 0x00007fd8480dd800 nmethod 1741 0x00007fd83108be90 code [0x00007fd83108c040, 0x00007fd83108c1e8]
Event: 690.690 Thread 0x00007fd8480dd800 1742       3       jdk.internal.org.objectweb.asm.AnnotationVisitor::<init> (47 bytes)
Event: 690.694 Thread 0x00007fd8480dd800 nmethod 1742 0x00007fd83108c290 code [0x00007fd83108c440, 0x00007fd83108c7e8]
Event: 690.704 Thread 0x00007fd8480dd800 1743       3       java.util.concurrent.atomic.AtomicInteger::incrementAndGet (14 bytes)
Event: 690.707 Thread 0x00007fd8480dd800 nmethod 1743 0x00007fd83108c910 code [0x00007fd83108caa0, 0x00007fd83108cc30]
Event: 690.711 Thread 0x00007fd8480dd800 1744       3       java.io.FilePermission::containsPath (245 bytes)
Event: 690.728 Thread 0x00007fd8480dd800 nmethod 1744 0x00007fd83108cc90 code [0x00007fd83108d160, 0x00007fd83108fca8]
Event: 690.728 Thread 0x00007fd8480dd800 1745       3       java.io.FilePermission::impliesIgnoreMask (462 bytes)
Event: 690.739 Thread 0x00007fd8480dd800 nmethod 1745 0x00007fd831090810 code [0x00007fd831090a80, 0x00007fd831091bf8]
Event: 690.739 Thread 0x00007fd8480dd800 1746       3       java.util.concurrent.ConcurrentHashMap$BaseIterator::<init> (21 bytes)
Event: 690.741 Thread 0x00007fd8480dd800 nmethod 1746 0x00007fd831091e10 code [0x00007fd831091fc0, 0x00007fd8310921d8]
Event: 690.741 Thread 0x00007fd8480dd800 1747       3       java.util.concurrent.ConcurrentHashMap$Traverser::<init> (36 bytes)
Event: 690.743 Thread 0x00007fd8480dd800 nmethod 1747 0x00007fd831092290 code [0x00007fd831092420, 0x00007fd831092630]
Event: 690.758 Thread 0x00007fd8480dd800 1749       3       sun.nio.fs.UnixPath::getByteArrayForSysCalls (48 bytes)
Event: 690.764 Thread 0x00007fd8480db800 nmethod 1702 0x00007fd838310090 code [0x00007fd8383103e0, 0x00007fd838312d50]
Event: 690.764 Thread 0x00007fd8480dd800 nmethod 1749 0x00007fd831092710 code [0x00007fd8310928e0, 0x00007fd831092e28]
Event: 690.764 Thread 0x00007fd8480db800 1748       4       sun.nio.fs.UnixPath::compareTo (92 bytes)
Event: 690.782 Thread 0x00007fd8480db800 nmethod 1748 0x00007fd838314990 code [0x00007fd838314b20, 0x00007fd838314cd8]
Event: 690.782 Thread 0x00007fd8480db800 1708   !   4       sun.nio.fs.UnixPath::initOffsets (189 bytes)

GC Heap History (0 events):
No events

Deoptimization events (20 events):
Event: 689.159 Thread 0x00007fd848019000 Uncommon trap: trap_request=0xffffff45 fr.pc=0x00007fd8382cfbfc relative=0x00000000000003bc
Event: 689.159 Thread 0x00007fd848019000 Uncommon trap: reason=unstable_if action=reinterpret pc=0x00007fd8382cfbfc method=java.util.zip.ZipInputStream.read([BII)I @ 38 c2
Event: 689.159 Thread 0x00007fd848019000 DEOPT PACKING pc=0x00007fd8382cfbfc sp=0x00007fd8509f9460
Event: 689.159 Thread 0x00007fd848019000 DEOPT UNPACKING pc=0x00007fd8307e811f sp=0x00007fd8509f93d0 mode 2
Event: 689.160 Thread 0x00007fd848019000 Uncommon trap: trap_request=0xffffff45 fr.pc=0x00007fd8382d4a64 relative=0x0000000000000664
Event: 689.160 Thread 0x00007fd848019000 Uncommon trap: reason=unstable_if action=reinterpret pc=0x00007fd8382d4a64 method=java.util.zip.ZipInputStream.read([BII)I @ 38 c2
Event: 689.160 Thread 0x00007fd848019000 DEOPT PACKING pc=0x00007fd8382d4a64 sp=0x00007fd8509f9620
Event: 689.160 Thread 0x00007fd848019000 DEOPT UNPACKING pc=0x00007fd8307e811f sp=0x00007fd8509f9518 mode 2
Event: 689.909 Thread 0x00007fd848019000 DEOPT PACKING pc=0x00007fd830fe2709 sp=0x00007fd8509fb250
Event: 689.909 Thread 0x00007fd848019000 DEOPT UNPACKING pc=0x00007fd8307e862b sp=0x00007fd8509fa708 mode 0
Event: 690.107 Thread 0x00007fd848019000 DEOPT PACKING pc=0x00007fd8310630df sp=0x00007fd8509fa000
Event: 690.107 Thread 0x00007fd848019000 DEOPT UNPACKING pc=0x00007fd8307e862b sp=0x00007fd8509f95c0 mode 0
Event: 690.171 Thread 0x00007fd848019000 DEOPT PACKING pc=0x00007fd8310630df sp=0x00007fd8509fa040
Event: 690.171 Thread 0x00007fd848019000 DEOPT UNPACKING pc=0x00007fd8307e862b sp=0x00007fd8509f9600 mode 0
Event: 690.213 Thread 0x00007fd848019000 DEOPT PACKING pc=0x00007fd8310630df sp=0x00007fd8509fa030
Event: 690.213 Thread 0x00007fd848019000 DEOPT UNPACKING pc=0x00007fd8307e862b sp=0x00007fd8509f95f0 mode 0
Event: 690.253 Thread 0x00007fd848019000 DEOPT PACKING pc=0x00007fd8310630df sp=0x00007fd8509fa040
Event: 690.254 Thread 0x00007fd848019000 DEOPT UNPACKING pc=0x00007fd8307e862b sp=0x00007fd8509f9600 mode 0
Event: 690.376 Thread 0x00007fd848019000 DEOPT PACKING pc=0x00007fd830fe2709 sp=0x00007fd8509fa4a0
Event: 690.377 Thread 0x00007fd848019000 DEOPT UNPACKING pc=0x00007fd8307e862b sp=0x00007fd8509f9958 mode 0

Classes unloaded (0 events):
No events

Classes redefined (0 events):
No events

Internal exceptions (20 events):
Event: 689.933 Thread 0x00007fd848019000 Exception <a 'java/lang/ClassNotFoundException'{0x0000000084fbfe58}: javax/smartcardio/CardPermission> (0x0000000084fbfe58) 
thrown [/home/jenkins/workspace/build-scripts/jobs/jdk13u/jdk13u-linux-x64-hotspot/workspace/build/src/src/hotspot/share/classfile/systemDictionary.cpp, line 232]
Event: 689.990 Thread 0x00007fd848019000 Exception <a 'java/lang/ClassNotFoundException'{0x0000000084fe5d90}: javax/smartcardio/CardPermission> (0x0000000084fe5d90) 
thrown [/home/jenkins/workspace/build-scripts/jobs/jdk13u/jdk13u-linux-x64-hotspot/workspace/build/src/src/hotspot/share/classfile/systemDictionary.cpp, line 232]
Event: 690.007 Thread 0x00007fd848019000 Exception <a 'java/lang/ClassNotFoundException'{0x0000000084ff2b28}: org/elasticsearch/script/ClassPermission> (0x0000000084ff2b28) 
thrown [/home/jenkins/workspace/build-scripts/jobs/jdk13u/jdk13u-linux-x64-hotspot/workspace/build/src/src/hotspot/share/classfile/systemDictionary.cpp, line 232]
Event: 690.008 Thread 0x00007fd848019000 Exception <a 'java/lang/ClassNotFoundException'{0x0000000084ff3110}: org/elasticsearch/script/ClassPermission> (0x0000000084ff3110) 
thrown [/home/jenkins/workspace/build-scripts/jobs/jdk13u/jdk13u-linux-x64-hotspot/workspace/build/src/src/hotspot/share/classfile/systemDictionary.cpp, line 232]
Event: 690.008 Thread 0x00007fd848019000 Exception <a 'java/lang/ClassNotFoundException'{0x0000000084ff36f8}: org/elasticsearch/script/ClassPermission> (0x0000000084ff36f8) 
thrown [/home/jenkins/workspace/build-scripts/jobs/jdk13u/jdk13u-linux-x64-hotspot/workspace/build/src/src/hotspot/share/classfile/systemDictionary.cpp, line 232]
Event: 690.008 Thread 0x00007fd848019000 Exception <a 'java/lang/ClassNotFoundException'{0x0000000084ff3ce0}: org/elasticsearch/script/ClassPermission> (0x0000000084ff3ce0) 
thrown [/home/jenkins/workspace/build-scripts/jobs/jdk13u/jdk13u-linux-x64-hotspot/workspace/build/src/src/hotspot/share/classfile/systemDictionary.cpp, line 232]
Event: 690.009 Thread 0x00007fd848019000 Exception <a 'java/lang/ClassNotFoundException'{0x0000000084ff42c8}: org/elasticsearch/script/ClassPermission> (0x0000000084ff42c8) 
thrown [/home/jenkins/workspace/build-scripts/jobs/jdk13u/jdk13u-linux-x64-hotspot/workspace/build/src/src/hotspot/share/classfile/systemDictionary.cpp, line 232]
Event: 690.010 Thread 0x00007fd848019000 Exception <a 'java/lang/ClassNotFoundException'{0x0000000084ff48b0}: org/elasticsearch/script/ClassPermission> (0x0000000084ff48b0) 
thrown [/home/jenkins/workspace/build-scripts/jobs/jdk13u/jdk13u-linux-x64-hotspot/workspace/build/src/src/hotspot/share/classfile/systemDictionary.cpp, line 232]
Event: 690.053 Thread 0x00007fd848019000 Exception <a 'java/lang/ClassNotFoundException'{0x00000000850104c0}: javax/smartcardio/CardPermission> (0x00000000850104c0) 
thrown [/home/jenkins/workspace/build-scripts/jobs/jdk13u/jdk13u-linux-x64-hotspot/workspace/build/src/src/hotspot/share/classfile/systemDictionary.cpp, line 232]
Event: 690.117 Thread 0x00007fd848019000 Exception <a 'java/lang/ClassNotFoundException'{0x000000008503aff8}: javax/smartcardio/CardPermission> (0x000000008503aff8) 
thrown [/home/jenkins/workspace/build-scripts/jobs/jdk13u/jdk13u-linux-x64-hotspot/workspace/build/src/src/hotspot/share/classfile/systemDictionary.cpp, line 232]
Event: 690.183 Thread 0x00007fd848019000 Exception <a 'java/lang/ClassNotFoundException'{0x0000000085074230}: javax/smartcardio/CardPermission> (0x0000000085074230) 
thrown [/home/jenkins/workspace/build-scripts/jobs/jdk13u/jdk13u-linux-x64-hotspot/workspace/build/src/src/hotspot/share/classfile/systemDictionary.cpp, line 232]
Event: 690.225 Thread 0x00007fd848019000 Exception <a 'java/lang/ClassNotFoundException'{0x000000008508bc80}: org/elasticsearch/SpecialPermission> (0x000000008508bc80) 
thrown [/home/jenkins/workspace/build-scripts/jobs/jdk13u/jdk13u-linux-x64-hotspot/workspace/build/src/src/hotspot/share/classfile/systemDictionary.cpp, line 232]
Event: 690.270 Thread 0x00007fd848019000 Exception <a 'java/lang/ClassNotFoundException'{0x00000000850a6b60}: javax/smartcardio/CardPermission> (0x00000000850a6b60) 
thrown [/home/jenkins/workspace/build-scripts/jobs/jdk13u/jdk13u-linux-x64-hotspot/workspace/build/src/src/hotspot/share/classfile/systemDictionary.cpp, line 232]
Event: 690.482 Thread 0x00007fd848019000 Exception <a 'java/lang/NoSuchMethodError'{0x000000008511e138}: 'java.lang.Object java.lang.invoke.DirectMethodHandle$Holder.invokeStatic(java.lang.Object, long, java.lang.Object)'> (0x000000008511e138) 
thrown [/home/jenkins/workspace/build-scripts/jobs/jdk13u/jdk13u-linux-x64-hotspot/workspace/build/src/src/hotspot/share/interpreter/linkResolver.cpp, line 767]
Event: 690.505 Thread 0x00007fd848019000 Exception <a 'java/lang/NoSuchMethodError'{0x000000008512aa58}: 'java.lang.Object java.lang.invoke.DirectMethodHandle$Holder.invokeStatic(java.lang.Object, long, java.lang.Object, java.lang.Object)'> (0x000000008512aa58) 
thrown [/home/jenkins/workspace/build-scripts/jobs/jdk13u/jdk13u-linux-x64-hotspot/workspace/build/src/src/hotspot/share/interpreter/linkResolver.cpp, line 767]
Event: 690.510 Thread 0x00007fd848019000 Exception <a 'java/lang/NoSuchMethodError'{0x000000008512dec0}: 'java.lang.Object java.lang.invoke.DirectMethodHandle$Holder.invokeSpecial(java.lang.Object, java.lang.Object, long, java.lang.Object, java.lang.Object)'> (0x000000008512dec0) 
thrown [/home/jenkins/workspace/build-scripts/jobs/jdk13u/jdk13u-linux-x64-hotspot/workspace/build/src/src/hotspot/share/interpreter/linkResolver.cpp, line 767]
Event: 690.545 Thread 0x00007fd848019000 Exception <a 'sun/nio/fs/UnixException'{0x0000000085148430}> (0x0000000085148430) 
thrown [/home/jenkins/workspace/build-scripts/jobs/jdk13u/jdk13u-linux-x64-hotspot/workspace/build/src/src/hotspot/share/prims/jni.cpp, line 598]
Event: 690.567 Thread 0x00007fd848019000 Exception <a 'sun/nio/fs/UnixException'{0x0000000085150140}> (0x0000000085150140) 
thrown [/home/jenkins/workspace/build-scripts/jobs/jdk13u/jdk13u-linux-x64-hotspot/workspace/build/src/src/hotspot/share/prims/jni.cpp, line 598]
Event: 690.699 Thread 0x00007fd848019000 Exception <a 'java/lang/NoSuchMethodError'{0x000000008517a958}: 'void java.lang.invoke.DirectMethodHandle$Holder.invokeStaticInit(java.lang.Object, java.lang.Object, java.lang.Object, java.lang.Object, java.lang.Object)'> (0x000000008517a958) 
thrown [/home/jenkins/workspace/build-scripts/jobs/jdk13u/jdk13u-linux-x64-hotspot/workspace/build/src/src/hotspot/share/interpreter/linkResolver.cpp, line 767]
Event: 690.745 Thread 0x00007fd848019000 Exception <a 'sun/nio/fs/UnixException'{0x0000000085198620}> (0x0000000085198620) 
thrown [/home/jenkins/workspace/build-scripts/jobs/jdk13u/jdk13u-linux-x64-hotspot/workspace/build/src/src/hotspot/share/prims/jni.cpp, line 598]

Events (20 events):
Event: 690.575 loading class java/nio/file/Path$1
Event: 690.575 loading class java/nio/file/Path$1 done
Event: 690.624 loading class org/apache/lucene/store/ByteBufferIndexInput
Event: 690.624 loading class org/apache/lucene/store/ByteBufferIndexInput done
Event: 690.655 loading class sun/misc/Unsafe
Event: 690.662 loading class sun/misc/Unsafe done
Event: 690.689 loading class org/apache/lucene/store/ByteBufferGuard$BufferCleaner
Event: 690.689 loading class org/apache/lucene/store/ByteBufferGuard$BufferCleaner done
Event: 690.692 loading class org/apache/lucene/store/ByteBufferGuard$BufferCleaner
Event: 690.692 loading class org/apache/lucene/store/ByteBufferGuard$BufferCleaner done
Event: 690.769 loading class sun/nio/ch/FileLockImpl
Event: 690.769 loading class java/nio/channels/FileLock
Event: 690.770 loading class java/nio/channels/FileLock done
Event: 690.770 loading class sun/nio/ch/FileLockImpl done
Event: 690.771 loading class sun/nio/ch/FileLockTable
Event: 690.772 loading class sun/nio/ch/FileLockTable done
Event: 690.773 loading class sun/nio/ch/FileKey
Event: 690.773 loading class sun/nio/ch/FileKey done
Event: 690.774 loading class sun/nio/ch/FileLockTable$FileLockReference
Event: 690.775 loading class sun/nio/ch/FileLockTable$FileLockReference done


Dynamic libraries:
00400000-00401000 r-xp 00000000 00:4d 3675670                            /opt/jdk-13.0.1+9/bin/java
00600000-00601000 r--p 00000000 00:4d 3675670                            /opt/jdk-13.0.1+9/bin/java
00601000-00602000 rw-p 00001000 00:4d 3675670                            /opt/jdk-13.0.1+9/bin/java
01892000-018b3000 rw-p 00000000 00:00 0                                  [heap]
80000000-100000000 rw-p 00000000 00:00 0 
800000000-800003000 rwxp 00001000 00:4d 4984711                          /opt/jdk-13.0.1+9/lib/server/classes.jsa
800003000-8003d7000 rw-p 00004000 00:4d 4984711                          /opt/jdk-13.0.1+9/lib/server/classes.jsa
8003d7000-800ae6000 r--p 003d8000 00:4d 4984711                          /opt/jdk-13.0.1+9/lib/server/classes.jsa
800ae6000-800ae7000 rw-p 00ae7000 00:4d 4984711                          /opt/jdk-13.0.1+9/lib/server/classes.jsa
800ae7000-800ca7000 rw-p 00000000 00:00 0 
800ca7000-840ae7000 ---p 00000000 00:00 0 
7fd7f0000000-7fd7f0021000 rw-p 00000000 00:00 0 
7fd7f0021000-7fd7f4000000 ---p 00000000 00:00 0 
7fd7f4000000-7fd7f4021000 rw-p 00000000 00:00 0 
7fd7f4021000-7fd7f8000000 ---p 00000000 00:00 0 
7fd7f8000000-7fd7f8021000 rw-p 00000000 00:00 0 
7fd7f8021000-7fd7fc000000 ---p 00000000 00:00 0 
7fd7fc000000-7fd7fc021000 rw-p 00000000 00:00 0 
7fd7fc021000-7fd800000000 ---p 00000000 00:00 0 
7fd800000000-7fd8002b4000 rw-p 00000000 00:00 0 
7fd8002b4000-7fd804000000 ---p 00000000 00:00 0 
7fd804000000-7fd804021000 rw-p 00000000 00:00 0 
7fd804021000-7fd808000000 ---p 00000000 00:00 0 
7fd808000000-7fd808021000 rw-p 00000000 00:00 0 
7fd808021000-7fd80c000000 ---p 00000000 00:00 0 
7fd80c000000-7fd80d88c000 rw-p 00000000 00:00 0 
7fd80d88c000-7fd810000000 ---p 00000000 00:00 0 
7fd810000000-7fd810021000 rw-p 00000000 00:00 0 
7fd810021000-7fd814000000 ---p 00000000 00:00 0 
7fd814000000-7fd814021000 rw-p 00000000 00:00 0 
7fd814021000-7fd818000000 ---p 00000000 00:00 0 
7fd818000000-7fd818021000 rw-p 00000000 00:00 0 
7fd818021000-7fd81c000000 ---p 00000000 00:00 0 
7fd81c000000-7fd81c021000 rw-p 00000000 00:00 0 
7fd81c021000-7fd820000000 ---p 00000000 00:00 0 
7fd82226d000-7fd82270b000 rw-p 00000000 00:00 0 
7fd822784000-7fd822944000 rw-p 00000000 00:00 0 
7fd822944000-7fd822984000 ---p 00000000 00:00 0 
7fd822984000-7fd82299c000 r-xp 00000000 00:4d 8126702                    /tmp/elasticsearch-5148610102302248278/jna--1985354563/jna10462247501568436268.tmp (deleted)
7fd82299c000-7fd822b9c000 ---p 00018000 00:4d 8126702                    /tmp/elasticsearch-5148610102302248278/jna--1985354563/jna10462247501568436268.tmp (deleted)
7fd822b9c000-7fd822b9d000 rw-p 00018000 00:4d 8126702                    /tmp/elasticsearch-5148610102302248278/jna--1985354563/jna10462247501568436268.tmp (deleted)
7fd822b9d000-7fd822ba3000 r-xp 00000000 00:4d 4596304                    /opt/jdk-13.0.1+9/lib/libmanagement_ext.so
7fd822ba3000-7fd822da2000 ---p 00006000 00:4d 4596304                    /opt/jdk-13.0.1+9/lib/libmanagement_ext.so
7fd822da2000-7fd822da3000 r--p 00005000 00:4d 4596304                    /opt/jdk-13.0.1+9/lib/libmanagement_ext.so
7fd822da3000-7fd822da4000 rw-p 00006000 00:4d 4596304                    /opt/jdk-13.0.1+9/lib/libmanagement_ext.so
7fd822e8a000-7fd822e8e000 r-xp 00000000 00:4d 4596302                    /opt/jdk-13.0.1+9/lib/libmanagement.so
7fd822e8e000-7fd82308e000 ---p 00004000 00:4d 4596302                    /opt/jdk-13.0.1+9/lib/libmanagement.so
7fd82308e000-7fd82308f000 r--p 00004000 00:4d 4596302                    /opt/jdk-13.0.1+9/lib/libmanagement.so
7fd82308f000-7fd823090000 rw-p 00005000 00:4d 4596302                    /opt/jdk-13.0.1+9/lib/libmanagement.so
7fd823090000-7fd8230d0000 r-xp 00000000 00:4d 4596313                    /opt/jdk-13.0.1+9/lib/libsunec.so
7fd8230d0000-7fd8232d0000 ---p 00040000 00:4d 4596313                    /opt/jdk-13.0.1+9/lib/libsunec.so
7fd8232d0000-7fd8232d5000 r--p 00040000 00:4d 4596313                    /opt/jdk-13.0.1+9/lib/libsunec.so
7fd8232d5000-7fd8232d7000 rw-p 00045000 00:4d 4596313                    /opt/jdk-13.0.1+9/lib/libsunec.so
7fd8232d7000-7fd8232ec000 r-xp 00000000 00:4d 4596306                    /opt/jdk-13.0.1+9/lib/libnet.so
7fd8232ec000-7fd8234eb000 ---p 00015000 00:4d 4596306                    /opt/jdk-13.0.1+9/lib/libnet.so
7fd8234eb000-7fd8234ec000 r--p 00014000 00:4d 4596306                    /opt/jdk-13.0.1+9/lib/libnet.so
7fd8234ec000-7fd8234ed000 rw-p 00015000 00:4d 4596306                    /opt/jdk-13.0.1+9/lib/libnet.so
7fd8234ed000-7fd8234fe000 r-xp 00000000 00:4d 4596307                    /opt/jdk-13.0.1+9/lib/libnio.so
7fd8234fe000-7fd8236fd000 ---p 00011000 00:4d 4596307                    /opt/jdk-13.0.1+9/lib/libnio.so
7fd8236fd000-7fd8236fe000 r--p 00010000 00:4d 4596307                    /opt/jdk-13.0.1+9/lib/libnio.so
7fd8236fe000-7fd8236ff000 rw-p 00011000 00:4d 4596307                    /opt/jdk-13.0.1+9/lib/libnio.so
7fd8236ff000-7fd823703000 ---p 00000000 00:00 0 
7fd823703000-7fd824000000 rw-p 00000000 00:00 0 
7fd824000000-7fd824021000 rw-p 00000000 00:00 0 
7fd824021000-7fd828000000 ---p 00000000 00:00 0 
7fd8280c4000-7fd8280c5000 ---p 00000000 00:00 0 
7fd8280c5000-7fd8281c6000 rw-p 00000000 00:00 0 
7fd8281c6000-7fd8281ca000 ---p 00000000 00:00 0 
7fd8281ca000-7fd8282c7000 rw-p 00000000 00:00 0 
7fd8282c7000-7fd8282cb000 ---p 00000000 00:00 0 
7fd8282cb000-7fd8283c8000 rw-p 00000000 00:00 0 
7fd8283c8000-7fd8283cc000 ---p 00000000 00:00 0 
7fd8283cc000-7fd8284c9000 rw-p 00000000 00:00 0 
7fd8284c9000-7fd8284cd000 ---p 00000000 00:00 0 
7fd8284cd000-7fd8285ca000 rw-p 00000000 00:00 0 
7fd8285ca000-7fd8285ce000 ---p 00000000 00:00 0 
7fd8285ce000-7fd8286cb000 rw-p 00000000 00:00 0 
7fd8286cb000-7fd8286cf000 ---p 00000000 00:00 0 
7fd8286cf000-7fd82c000000 rw-p 00000000 00:00 0 
7fd82c000000-7fd82c021000 rw-p 00000000 00:00 0 
7fd82c021000-7fd830000000 ---p 00000000 00:00 0 
7fd830050000-7fd830054000 ---p 00000000 00:00 0 
7fd830054000-7fd8300a3000 rw-p 00000000 00:00 0 
7fd8300a3000-7fd8300a7000 ---p 00000000 00:00 0 
7fd8300a7000-7fd8301a4000 rw-p 00000000 00:00 0 
7fd8301a4000-7fd8301a5000 ---p 00000000 00:00 0 
7fd8301a5000-7fd83049b000 rw-p 00000000 00:00 0 
7fd83049b000-7fd83049c000 ---p 00000000 00:00 0 
7fd83049c000-7fd83079f000 rw-p 00000000 00:00 0 
7fd83079f000-7fd830a0f000 rwxp 00000000 00:00 0 
7fd830a0f000-7fd830d2f000 ---p 00000000 00:00 0 
7fd830d2f000-7fd83109f000 rwxp 00000000 00:00 0 
7fd83109f000-7fd838267000 ---p 00000000 00:00 0 
7fd838267000-7fd8384d7000 rwxp 00000000 00:00 0 
7fd8384d7000-7fd83f79f000 ---p 00000000 00:00 0 
7fd83f79f000-7fd848000000 r--s 00000000 00:4d 4596317                    /opt/jdk-13.0.1+9/lib/modules
7fd848000000-7fd848bc7000 rw-p 00000000 00:00 0 
7fd848bc7000-7fd84c000000 ---p 00000000 00:00 0 
7fd84c005000-7fd84c49f000 rw-p 00000000 00:00 0 
7fd84c49f000-7fd84c4a0000 ---p 00000000 00:00 0 
7fd84c4a0000-7fd84c5a1000 rw-p 00000000 00:00 0 
7fd84c5a1000-7fd84c5a2000 ---p 00000000 00:00 0 
7fd84c5a2000-7fd84d3b0000 rw-p 00000000 00:00 0 
7fd84d3b0000-7fd84d3b6000 rw-p 00000000 00:00 0 
7fd84d3b6000-7fd84d49c000 ---p 00000000 00:00 0 
7fd84d49c000-7fd84d4a3000 rw-p 00000000 00:00 0 
7fd84d4a3000-7fd84d587000 ---p 00000000 00:00 0 
7fd84d587000-7fd84d5a2000 r-xp 00000000 00:4d 4596297                    /opt/jdk-13.0.1+9/lib/libjimage.so
7fd84d5a2000-7fd84d7a1000 ---p 0001b000 00:4d 4596297                    /opt/jdk-13.0.1+9/lib/libjimage.so
7fd84d7a1000-7fd84d7a3000 r--p 0001a000 00:4d 4596297                    /opt/jdk-13.0.1+9/lib/libjimage.so
7fd84d7a3000-7fd84d7a4000 rw-p 0001c000 00:4d 4596297                    /opt/jdk-13.0.1+9/lib/libjimage.so
7fd84d7a4000-7fd84d7ab000 r-xp 00000000 00:4d 4596316                    /opt/jdk-13.0.1+9/lib/libzip.so
7fd84d7ab000-7fd84d9aa000 ---p 00007000 00:4d 4596316                    /opt/jdk-13.0.1+9/lib/libzip.so
7fd84d9aa000-7fd84d9ab000 r--p 00006000 00:4d 4596316                    /opt/jdk-13.0.1+9/lib/libzip.so
7fd84d9ab000-7fd84d9ac000 rw-p 00007000 00:4d 4596316                    /opt/jdk-13.0.1+9/lib/libzip.so
7fd84d9ac000-7fd84d9b8000 r-xp 00000000 00:4d 3937679                    /usr/lib64/libnss_files-2.17.so
7fd84d9b8000-7fd84dbb7000 ---p 0000c000 00:4d 3937679                    /usr/lib64/libnss_files-2.17.so
7fd84dbb7000-7fd84dbb8000 r--p 0000b000 00:4d 3937679                    /usr/lib64/libnss_files-2.17.so
7fd84dbb8000-7fd84dbb9000 rw-p 0000c000 00:4d 3937679                    /usr/lib64/libnss_files-2.17.so
7fd84dbb9000-7fd84dbbf000 rw-p 00000000 00:00 0 
7fd84dbbf000-7fd84dbe7000 r-xp 00000000 00:4d 4596293                    /opt/jdk-13.0.1+9/lib/libjava.so
7fd84dbe7000-7fd84dde6000 ---p 00028000 00:4d 4596293                    /opt/jdk-13.0.1+9/lib/libjava.so
7fd84dde6000-7fd84dde7000 r--p 00027000 00:4d 4596293                    /opt/jdk-13.0.1+9/lib/libjava.so
7fd84dde7000-7fd84dde9000 rw-p 00028000 00:4d 4596293                    /opt/jdk-13.0.1+9/lib/libjava.so
7fd84dde9000-7fd84ddfa000 r-xp 00000000 00:4d 4596315                    /opt/jdk-13.0.1+9/lib/libverify.so
7fd84ddfa000-7fd84dff9000 ---p 00011000 00:4d 4596315                    /opt/jdk-13.0.1+9/lib/libverify.so
7fd84dff9000-7fd84dffb000 r--p 00010000 00:4d 4596315                    /opt/jdk-13.0.1+9/lib/libverify.so
7fd84dffb000-7fd84dffc000 rw-p 00012000 00:4d 4596315                    /opt/jdk-13.0.1+9/lib/libverify.so
7fd84dffc000-7fd84e003000 r-xp 00000000 00:4d 3938225                    /usr/lib64/librt-2.17.so
7fd84e003000-7fd84e202000 ---p 00007000 00:4d 3938225                    /usr/lib64/librt-2.17.so
7fd84e202000-7fd84e203000 r--p 00006000 00:4d 3938225                    /usr/lib64/librt-2.17.so
7fd84e203000-7fd84e204000 rw-p 00007000 00:4d 3938225                    /usr/lib64/librt-2.17.so
7fd84e204000-7fd84e305000 r-xp 00000000 00:4d 3937650                    /usr/lib64/libm-2.17.so
7fd84e305000-7fd84e504000 ---p 00101000 00:4d 3937650                    /usr/lib64/libm-2.17.so
7fd84e504000-7fd84e505000 r--p 00100000 00:4d 3937650                    /usr/lib64/libm-2.17.so
7fd84e505000-7fd84e506000 rw-p 00101000 00:4d 3937650                    /usr/lib64/libm-2.17.so
7fd84e506000-7fd84f848000 r-xp 00000000 00:4d 4984713                    /opt/jdk-13.0.1+9/lib/server/libjvm.so
7fd84f848000-7fd84fa47000 ---p 01342000 00:4d 4984713                    /opt/jdk-13.0.1+9/lib/server/libjvm.so
7fd84fa47000-7fd84fb0f000 r--p 01341000 00:4d 4984713                    /opt/jdk-13.0.1+9/lib/server/libjvm.so
7fd84fb0f000-7fd84fb4b000 rw-p 01409000 00:4d 4984713                    /opt/jdk-13.0.1+9/lib/server/libjvm.so
7fd84fb4b000-7fd84fbd0000 rw-p 00000000 00:00 0 
7fd84fbd0000-7fd84fd93000 r-xp 00000000 00:4d 3937536                    /usr/lib64/libc-2.17.so
7fd84fd93000-7fd84ff93000 ---p 001c3000 00:4d 3937536                    /usr/lib64/libc-2.17.so
7fd84ff93000-7fd84ff97000 r--p 001c3000 00:4d 3937536                    /usr/lib64/libc-2.17.so
7fd84ff97000-7fd84ff99000 rw-p 001c7000 00:4d 3937536                    /usr/lib64/libc-2.17.so
7fd84ff99000-7fd84ff9e000 rw-p 00000000 00:00 0 
7fd84ff9e000-7fd84ffa0000 r-xp 00000000 00:4d 3937565                    /usr/lib64/libdl-2.17.so
7fd84ffa0000-7fd8501a0000 ---p 00002000 00:4d 3937565                    /usr/lib64/libdl-2.17.so
7fd8501a0000-7fd8501a1000 r--p 00002000 00:4d 3937565                    /usr/lib64/libdl-2.17.so
7fd8501a1000-7fd8501a2000 rw-p 00003000 00:4d 3937565                    /usr/lib64/libdl-2.17.so
7fd8501a2000-7fd8501b9000 r-xp 00000000 00:4d 3937827                    /usr/lib64/libpthread-2.17.so
7fd8501b9000-7fd8503b8000 ---p 00017000 00:4d 3937827                    /usr/lib64/libpthread-2.17.so
7fd8503b8000-7fd8503b9000 r--p 00016000 00:4d 3937827                    /usr/lib64/libpthread-2.17.so
7fd8503b9000-7fd8503ba000 rw-p 00017000 00:4d 3937827                    /usr/lib64/libpthread-2.17.so
7fd8503ba000-7fd8503be000 rw-p 00000000 00:00 0 
7fd8503be000-7fd8503ce000 r-xp 00000000 00:4d 4596298                    /opt/jdk-13.0.1+9/lib/libjli.so
7fd8503ce000-7fd8505cd000 ---p 00010000 00:4d 4596298                    /opt/jdk-13.0.1+9/lib/libjli.so
7fd8505cd000-7fd8505ce000 r--p 0000f000 00:4d 4596298                    /opt/jdk-13.0.1+9/lib/libjli.so
7fd8505ce000-7fd8505cf000 rw-p 00010000 00:4d 4596298                    /opt/jdk-13.0.1+9/lib/libjli.so
7fd8505cf000-7fd8505e4000 r-xp 00000000 00:4d 3938287                    /usr/lib64/libz.so.1.2.7
7fd8505e4000-7fd8507e3000 ---p 00015000 00:4d 3938287                    /usr/lib64/libz.so.1.2.7
7fd8507e3000-7fd8507e4000 r--p 00014000 00:4d 3938287                    /usr/lib64/libz.so.1.2.7
7fd8507e4000-7fd8507e5000 rw-p 00015000 00:4d 3938287                    /usr/lib64/libz.so.1.2.7
7fd8507e5000-7fd850807000 r-xp 00000000 00:4d 3937512                    /usr/lib64/ld-2.17.so
7fd850818000-7fd850819000 rw-p 00000000 00:00 0 
7fd850819000-7fd85081a000 r--p 00000000 00:00 0 
7fd85081a000-7fd85081b000 rwxp 00000000 00:00 0 
7fd85081b000-7fd8508ed000 rw-p 00000000 00:00 0 
7fd8508ed000-7fd8508f4000 ---p 00000000 00:00 0 
7fd8508f4000-7fd8508fc000 rw-s 00000000 00:4d 8000748                    /tmp/hsperfdata_elasticsearch/1
7fd8508fc000-7fd850900000 ---p 00000000 00:00 0 
7fd850900000-7fd850a02000 rw-p 00000000 00:00 0 
7fd850a02000-7fd850a03000 ---p 00000000 00:00 0 
7fd850a03000-7fd850a04000 r--p 00000000 00:00 0 
7fd850a04000-7fd850a05000 ---p 00000000 00:00 0 
7fd850a05000-7fd850a06000 rw-p 00000000 00:00 0 
7fd850a06000-7fd850a07000 r--p 00021000 00:4d 3937512                    /usr/lib64/ld-2.17.so
7fd850a07000-7fd850a08000 rw-p 00022000 00:4d 3937512                    /usr/lib64/ld-2.17.so
7fd850a08000-7fd850a09000 rw-p 00000000 00:00 0 
7ffd9979b000-7ffd997bc000 rw-p 00000000 00:00 0                          [stack]
7ffd997bc000-7ffd997c0000 r--p 00000000 00:00 0                          [vvar]
7ffd997c0000-7ffd997c2000 r-xp 00000000 00:00 0                          [vdso]
ffffffffff600000-ffffffffff601000 --xp 00000000 00:00 0                  [vsyscall]


VM Arguments:
jvm_args: -Xms1g -Xmx1g -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=75 -XX:+UseCMSInitiatingOccupancyOnly -Des.networkaddress.cache.ttl=60 -Des.networkaddress.cache.negative.ttl=10 -XX:+AlwaysPreTouch -Xss1m -Djava.awt.headless=true -Dfile.encoding=UTF-8 -Djna.nosys=true -XX:-OmitStackTraceInFastThrow -Dio.netty.noUnsafe=true -Dio.netty.noKeySetOptimization=true -Dio.netty.recycler.maxCapacityPerThread=0 -Dlog4j.shutdownHookEnabled=false -Dlog4j2.disable.jmx=true -Djava.io.tmpdir=/tmp/elasticsearch-5148610102302248278 -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=data -XX:ErrorFile=logs/hs_err_pid%p.log -Xlog:gc*,gc+age=trace,safepoint:file=logs/gc.log:utctime,pid,tags:filecount=32,filesize=64m -Djava.locale.providers=COMPAT -XX:UseAVX=2 -Xms2g -Xmx2g -Des.path.home=/usr/share/elasticsearch -Des.path.conf=/usr/share/elasticsearch/config -Des.distribution.flavor=oss -Des.distribution.type=docker 
java_command: org.elasticsearch.bootstrap.Elasticsearch -Enetwork.bind_host=0.0.0.0 -Ehttp.max_content_length=2000mb
java_class_path (initial): /usr/share/elasticsearch/lib/lucene-suggest-7.7.2.jar:/usr/share/elasticsearch/lib/lucene-misc-7.7.2.jar:/usr/share/elasticsearch/lib/jopt-simple-5.0.2.jar:/usr/share/elasticsearch/lib/HdrHistogram-2.1.9.jar:/usr/share/elasticsearch/lib/t-digest-3.2.jar:/usr/share/elasticsearch/lib/jackson-dataformat-yaml-2.8.11.jar:/usr/share/elasticsearch/lib/hppc-0.7.1.jar:/usr/share/elasticsearch/lib/jts-core-1.15.0.jar:/usr/share/elasticsearch/lib/snakeyaml-1.17.jar:/usr/share/elasticsearch/lib/elasticsearch-x-content-6.8.6.jar:/usr/share/elasticsearch/lib/lucene-analyzers-common-7.7.2.jar:/usr/share/elasticsearch/lib/jna-4.5.1.jar:/usr/share/elasticsearch/lib/lucene-highlighter-7.7.2.jar:/usr/share/elasticsearch/lib/log4j-core-2.11.1.jar:/usr/share/elasticsearch/lib/lucene-queryparser-7.7.2.jar:/usr/share/elasticsearch/lib/log4j-api-2.11.1.jar:/usr/share/elasticsearch/lib/lucene-spatial-extras-7.7.2.jar:/usr/share/elasticsearch/lib/elasticsearch-secure-sm-6.8.6.jar:/usr/share/elasticsearch/lib/jackson-core-2.8.11.jar:/usr/share/elasticsearch/lib/lucene-queries-7.7.2.jar:/usr/share/elasticsearch/lib/java-version-checker-6.8.6.jar:/usr/share/elasticsearch/lib/jackson-dataformat-smile-2.8.11.jar:/usr/share/elasticsearch/lib/elasticsearch-6.8.6.jar:/usr/share/elasticsearch/lib/lucene-spatial3d-7.7.2.jar:/usr/share/elasticsearch/lib/joda-time-2.10.1.jar:/usr/share/elasticsearch/lib/plugin-classloader-6.8.6.jar:/usr/share/elasticsearch/lib/lucene-spatial-7.7.2.jar:/usr/share/elasticsearch/lib/elasticsearch-core-6.8.6.jar:/usr/share/elasticsearch/lib/spatial4j-0.7.jar:/usr/share/elasticsearch/lib/lucene-memory-7.7.2.jar:/usr/share/elasticsearch/lib/lucene-join-7.7.2.jar:/usr/share/elasticsearch/lib/jackson-dataformat-cbor-2.8.11.jar:/usr/share/elasticsearch/lib/lucene-backward-codecs-7.7.2.jar:/usr/share/elasticsearch/lib/lucene-core-7.7.2.jar:/usr/share/elasticsearch/lib/lucene-sandbox-7.7.2.jar:/usr/share/elasticsearch/lib/log4j-1.2-api-2.11.1.jar:/usr/
Launcher Type: SUN_STANDARD

[Global flags]
     bool AlwaysPreTouch                           = true                                      {product} {command line}
     intx CICompilerCount                          = 3                                         {product} {ergonomic}
     intx CMSInitiatingOccupancyFraction           = 75                                        {product} {command line}
    ccstr ErrorFile                                = logs/hs_err_pid%p.log                     {product} {command line}
     bool HeapDumpOnOutOfMemoryError               = true                                   {manageable} {command line}
    ccstr HeapDumpPath                             = data                                   {manageable} {command line}
   size_t InitialHeapSize                          = 2147483648                                {product} {command line}
   size_t MaxHeapSize                              = 2147483648                                {product} {command line}
   size_t MaxNewSize                               = 523436032                                 {product} {ergonomic}
    uintx MaxTenuringThreshold                     = 6                                         {product} {ergonomic}
   size_t MinHeapDeltaBytes                        = 196608                                    {product} {ergonomic}
   size_t MinHeapSize                              = 2147483648                                {product} {command line}
   size_t NewSize                                  = 523436032                                 {product} {ergonomic}
    uintx NonNMethodCodeHeapSize                   = 5830732                                {pd product} {ergonomic}
    uintx NonProfiledCodeHeapSize                  = 122913754                              {pd product} {ergonomic}
   size_t OldSize                                  = 1624047616                                {product} {ergonomic}
     bool OmitStackTraceInFastThrow                = false                                     {product} {command line}
    uintx ProfiledCodeHeapSize                     = 122913754                              {pd product} {ergonomic}
    uintx ReservedCodeCacheSize                    = 251658240                              {pd product} {ergonomic}
     bool SegmentedCodeCache                       = true                                      {product} {ergonomic}
   size_t SoftMaxHeapSize                          = 2147483648                             {manageable} {ergonomic}
     intx ThreadStackSize                          = 1024                                   {pd product} {command line}
     intx UseAVX                                   = 0                                    {ARCH product} {command line}
     bool UseCMSInitiatingOccupancyOnly            = true                                      {product} {command line}
     bool UseCompressedClassPointers               = true                                 {lp64_product} {ergonomic}
     bool UseCompressedOops                        = true                                 {lp64_product} {ergonomic}
     bool UseConcMarkSweepGC                       = true                                      {product} {command line}

Logging:
Log output configuration:
 #0: stdout all=warning uptime,level,tags
 #1: stderr all=off uptime,level,tags
 #2: file=logs/gc.log all=off,gc*=info,age*=trace,safepoint=info utctime,pid,tags filecount=32,filesize=65536K

Environment Variables:
JAVA_HOME=/opt/jdk-13.0.1+9
PATH=/usr/share/elasticsearch/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin

Signal Handlers:
SIGSEGV: [libjvm.so+0xfe1a10], sa_mask[0]=11111111011111111101111111111110, sa_flags=SA_RESTART|SA_SIGINFO
SIGBUS: [libjvm.so+0xfe1a10], sa_mask[0]=11111111011111111101111111111110, sa_flags=SA_RESTART|SA_SIGINFO
SIGFPE: [libjvm.so+0xfe1a10], sa_mask[0]=11111111011111111101111111111110, sa_flags=SA_RESTART|SA_SIGINFO
SIGPIPE: [libjvm.so+0xc7cd40], sa_mask[0]=11111111011111111101111111111110, sa_flags=SA_RESTART|SA_SIGINFO
SIGXFSZ: [libjvm.so+0xc7cd40], sa_mask[0]=11111111011111111101111111111110, sa_flags=SA_RESTART|SA_SIGINFO
SIGILL: [libjvm.so+0xfe1a10], sa_mask[0]=11111111011111111101111111111110, sa_flags=SA_RESTART|SA_SIGINFO
SIGUSR2: [libjvm.so+0xc7cbe0], sa_mask[0]=00000000000000000000000000000000, sa_flags=SA_RESTART|SA_SIGINFO
SIGHUP: [libjvm.so+0xc7d1e0], sa_mask[0]=11111111011111111101111111111110, sa_flags=SA_RESTART|SA_SIGINFO
SIGINT: [libjvm.so+0xc7d1e0], sa_mask[0]=11111111011111111101111111111110, sa_flags=SA_RESTART|SA_SIGINFO
SIGTERM: [libjvm.so+0xc7d1e0], sa_mask[0]=11111111011111111101111111111110, sa_flags=SA_RESTART|SA_SIGINFO
SIGQUIT: [libjvm.so+0xc7d1e0], sa_mask[0]=11111111011111111101111111111110, sa_flags=SA_RESTART|SA_SIGINFO


---------------  S Y S T E M  ---------------

OS:CentOS Linux release 7.7.1908 (Core)
uname:Linux 5.15.0-53-generic #59-Ubuntu SMP Mon Oct 17 18:53:30 UTC 2022 x86_64
libc:glibc 2.17 NPTL 2.17 
rlimit: STACK 8192k, CORE 0k, NPROC 15402, NOFILE 524288, AS infinity, DATA infinity, FSIZE infinity
load average:1.63 2.85 5.33

/proc/meminfo:
MemTotal:        4013784 kB
MemFree:          174192 kB
MemAvailable:     542168 kB
Buffers:           34064 kB
Cached:           566604 kB
SwapCached:       142180 kB
Active:           511156 kB
Inactive:        2975916 kB
Active(anon):     222212 kB
Inactive(anon):  2744984 kB
Active(file):     288944 kB
Inactive(file):   230932 kB
Unevictable:       23712 kB
Mlocked:              32 kB
SwapTotal:       3993596 kB
SwapFree:        2874288 kB
Dirty:               148 kB
Writeback:             0 kB
AnonPages:       2782812 kB
Mapped:           296696 kB
Shmem:             80768 kB
KReclaimable:     113224 kB
Slab:             211796 kB
SReclaimable:     113224 kB
SUnreclaim:        98572 kB
KernelStack:       16064 kB
PageTables:        33836 kB
NFS_Unstable:          0 kB
Bounce:                0 kB
WritebackTmp:          0 kB
CommitLimit:     6000488 kB
Committed_AS:   11831144 kB
VmallocTotal:   34359738367 kB
VmallocUsed:       36380 kB
VmallocChunk:          0 kB
Percpu:             7968 kB
HardwareCorrupted:     0 kB
AnonHugePages:         0 kB
ShmemHugePages:        0 kB
ShmemPmdMapped:        0 kB
FileHugePages:         0 kB
FilePmdMapped:         0 kB
HugePages_Total:       0
HugePages_Free:        0
HugePages_Rsvd:        0
HugePages_Surp:        0
Hugepagesize:       2048 kB
Hugetlb:               0 kB
DirectMap4k:      244676 kB
DirectMap2M:     3946496 kB


/proc/sys/kernel/threads-max (system-wide limit on the number of threads):
30805


/proc/sys/vm/max_map_count (maximum number of memory map areas a process may have):
65530


/proc/sys/kernel/pid_max (system-wide limit on number of process identifiers):
4194304



Steal ticks since vm start: 0
Steal ticks percentage since vm start:  0.000

CPU:total 6 (initial active 6) (6 cores per cpu, 1 threads per core) family 15 model 107 stepping 1, cmov, cx8, fxsr, mmx, sse, sse2, sse3, tsc
CPU Model and flags from /proc/cpuinfo:
model name	: QEMU Virtual CPU version 2.5+
flags		: fpu de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx lm rep_good nopl cpuid extd_apicid pni cx16 hypervisor lahf_lm cmp_legacy svm 3dnowprefetch vmmcall

Memory: 4k page, physical 4013784k(174192k free), swap 3993596k(2874288k free)

vm_info: OpenJDK 64-Bit Server VM (13.0.1+9) for linux-amd64 JRE (13.0.1+9), built on Oct 26 2019 10:07:46 by "jenkins" with gcc 7.3.1 20180303 (Red Hat 7.3.1-5)

END.

Full `docker-compose` logs are below:

elasticsearch_1  | OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
elasticsearch_1  | OpenJDK 64-Bit Server VM warning: UseAVX=2 is not supported on this CPU, setting it to UseAVX=0
mariadb_1        | 2022-11-22 22:29:34+00:00 [Note] [Entrypoint]: Entrypoint script for MariaDB Server 1:10.6.11+maria~ubu2004 started.
mariadb_1        | 2022-11-22 22:29:39+00:00 [Note] [Entrypoint]: Switching to dedicated user 'mysql'
mariadb_1        | 2022-11-22 22:29:40+00:00 [Note] [Entrypoint]: Entrypoint script for MariaDB Server 1:10.6.11+maria~ubu2004 started.
mariadb_1        | 2022-11-22 22:29:43+00:00 [Note] [Entrypoint]: Initializing database files
hatstall_1       | Running Hatstall
hatstall_1       | Site 000-default disabled.
hatstall_1       | To activate the new configuration, you need to run:
hatstall_1       |   service apache2 reload
hatstall_1       | Enabling site apache-hatstall.
mariadb_1        | 
mariadb_1        | 
mariadb_1        | PLEASE REMEMBER TO SET A PASSWORD FOR THE MariaDB root USER !
mariadb_1        | To do so, start the server, then issue the following command:
mariadb_1        | 
mariadb_1        | '/usr/bin/mysql_secure_installation'
mariadb_1        | 
mariadb_1        | which will also give you the option of removing the test
mariadb_1        | databases and anonymous user created by default.  This is
mariadb_1        | strongly recommended for production servers.
mariadb_1        | 
mariadb_1        | See the MariaDB Knowledgebase at https://mariadb.com/kb
mariadb_1        | 
mariadb_1        | Please report any problems at https://mariadb.org/jira
mariadb_1        | 
mariadb_1        | The latest information about MariaDB is available at https://mariadb.org/.
mariadb_1        | 
mariadb_1        | Consider joining MariaDB's strong and vibrant community:
mariadb_1        | https://mariadb.org/get-involved/
mariadb_1        | 
mariadb_1        | 2022-11-22 22:30:07+00:00 [Note] [Entrypoint]: Database files initialized
hatstall_1       | To activate the new configuration, you need to run:
hatstall_1       |   service apache2 reload
mariadb_1        | 2022-11-22 22:30:07+00:00 [Note] [Entrypoint]: Starting temporary server
mariadb_1        | 2022-11-22 22:30:07+00:00 [Note] [Entrypoint]: Waiting for server startup
mariadb_1        | 2022-11-22 22:30:08 0 [Note] mariadbd (server 10.6.11-MariaDB-1:10.6.11+maria~ubu2004) starting as process 96 ...
mariadb_1        | 2022-11-22 22:30:09 0 [Note] InnoDB: Compressed tables use zlib 1.2.11
mariadb_1        | 2022-11-22 22:30:09 0 [Note] InnoDB: Number of pools: 1
mariadb_1        | 2022-11-22 22:30:09 0 [Note] InnoDB: Using generic crc32 instructions
mariadb_1        | 2022-11-22 22:30:09 0 [Note] mariadbd: O_TMPFILE is not supported on /tmp (disabling future attempts)
mariadb_1        | 2022-11-22 22:30:10 0 [Note] InnoDB: Using Linux native AIO
hatstall_1       | Considering dependency setenvif for ssl:
hatstall_1       | Module setenvif already enabled
hatstall_1       | Considering dependency mime for ssl:
hatstall_1       | Module mime already enabled
hatstall_1       | Considering dependency socache_shmcb for ssl:
hatstall_1       | Enabling module socache_shmcb.
mariadb_1        | 2022-11-22 22:30:10 0 [Note] InnoDB: Initializing buffer pool, total size = 134217728, chunk size = 134217728
mariadb_1        | 2022-11-22 22:30:10 0 [Note] InnoDB: Completed initialization of buffer pool
hatstall_1       | Enabling module ssl.
hatstall_1       | See /usr/share/doc/apache2/README.Debian.gz on how to configure SSL and create self-signed certificates.
hatstall_1       | To activate the new configuration, you need to run:
hatstall_1       |   service apache2 restart
mariadb_1        | 2022-11-22 22:30:11 0 [Note] InnoDB: 128 rollback segments are active.
mariadb_1        | 2022-11-22 22:30:11 0 [Note] InnoDB: Creating shared tablespace for temporary tables
mariadb_1        | 2022-11-22 22:30:11 0 [Note] InnoDB: Setting file './ibtmp1' size to 12 MB. Physically writing the file full; Please wait ...
mariadb_1        | 2022-11-22 22:30:11 0 [Note] InnoDB: File './ibtmp1' size is now 12 MB.
mariadb_1        | 2022-11-22 22:30:12 0 [Note] InnoDB: 10.6.11 started; log sequence number 42120; transaction id 14
mariadb_1        | 2022-11-22 22:30:12 0 [Note] Plugin 'FEEDBACK' is disabled.
mariadb_1        | 2022-11-22 22:30:12 0 [Warning] 'user' entry 'root@c11314e396b0' ignored in --skip-name-resolve mode.
mariadb_1        | 2022-11-22 22:30:13 0 [Warning] 'proxies_priv' entry '@% root@c11314e396b0' ignored in --skip-name-resolve mode.
mariadb_1        | 2022-11-22 22:30:13 0 [Note] mariadbd: ready for connections.
mariadb_1        | Version: '10.6.11-MariaDB-1:10.6.11+maria~ubu2004'  socket: '/run/mysqld/mysqld.sock'  port: 0  mariadb.org binary distribution
mariadb_1        | 2022-11-22 22:30:14+00:00 [Note] [Entrypoint]: Temporary server started.
hatstall_1       | Django configured for deployment (secret, debug, allowed_hosts) in django_hatstall/settings.py
hatstall_1       | No changes detected
hatstall_1       | Operations to perform:
hatstall_1       |   Apply all migrations: admin, auth, contenttypes, sessions
hatstall_1       | Running migrations:
hatstall_1       |   Applying contenttypes.0001_initial... OK
hatstall_1       |   Applying auth.0001_initial... OK
hatstall_1       |   Applying admin.0001_initial... OK
hatstall_1       |   Applying admin.0002_logentry_remove_auto_add... OK
hatstall_1       |   Applying admin.0003_logentry_add_action_flag_choices... OK
hatstall_1       |   Applying contenttypes.0002_remove_content_type_name... OK
hatstall_1       |   Applying auth.0002_alter_permission_name_max_length... OK
hatstall_1       |   Applying auth.0003_alter_user_email_max_length... OK
hatstall_1       |   Applying auth.0004_alter_user_username_opts... OK
hatstall_1       |   Applying auth.0005_alter_user_last_login_null... OK
hatstall_1       |   Applying auth.0006_require_contenttypes_0002... OK
hatstall_1       |   Applying auth.0007_alter_validators_add_error_messages... OK
hatstall_1       |   Applying auth.0008_alter_user_username_max_length... OK
hatstall_1       |   Applying auth.0009_alter_user_last_name_max_length... OK
hatstall_1       |   Applying auth.0010_alter_group_name_max_length... OK
hatstall_1       |   Applying auth.0011_update_proxy_permissions... OK
hatstall_1       |   Applying auth.0012_alter_user_first_name_max_length... OK
hatstall_1       |   Applying sessions.0001_initial... OK
mariadb_1        | 2022-11-22 22:31:01+00:00 [Note] [Entrypoint]: Securing system users (equivalent to running mysql_secure_installation)
mariadb_1        | 
mariadb_1        | 2022-11-22 22:31:02+00:00 [Note] [Entrypoint]: Stopping temporary server
mariadb_1        | 2022-11-22 22:31:02 0 [Note] mariadbd (initiated by: unknown): Normal shutdown
mariadb_1        | 2022-11-22 22:31:03 0 [Note] InnoDB: FTS optimize thread exiting.
mariadb_1        | 2022-11-22 22:31:03 0 [Note] InnoDB: Starting shutdown...
mariadb_1        | 2022-11-22 22:31:03 0 [Note] InnoDB: Dumping buffer pool(s) to /var/lib/mysql/ib_buffer_pool
mariadb_1        | 2022-11-22 22:31:03 0 [Note] InnoDB: Buffer pool(s) dump completed at 221122 22:31:03
mariadb_1        | 2022-11-22 22:31:04 0 [Note] InnoDB: Removed temporary tablespace data file: "./ibtmp1"
mariadb_1        | 2022-11-22 22:31:04 0 [Note] InnoDB: Shutdown completed; log sequence number 42132; transaction id 15
mariadb_1        | 2022-11-22 22:31:04 0 [Note] mariadbd: Shutdown complete
mariadb_1        | 
mariadb_1        | 2022-11-22 22:31:05+00:00 [Note] [Entrypoint]: Temporary server stopped
mariadb_1        | 
mariadb_1        | 2022-11-22 22:31:05+00:00 [Note] [Entrypoint]: MariaDB init process done. Ready for start up.
mariadb_1        | 
mariadb_1        | 2022-11-22 22:31:06 0 [Note] mariadbd (server 10.6.11-MariaDB-1:10.6.11+maria~ubu2004) starting as process 1 ...
mariadb_1        | 2022-11-22 22:31:07 0 [Note] InnoDB: Compressed tables use zlib 1.2.11
mariadb_1        | 2022-11-22 22:31:08 0 [Note] InnoDB: Number of pools: 1
mariadb_1        | 2022-11-22 22:31:08 0 [Note] InnoDB: Using generic crc32 instructions
mariadb_1        | 2022-11-22 22:31:08 0 [Note] mariadbd: O_TMPFILE is not supported on /tmp (disabling future attempts)
mariadb_1        | 2022-11-22 22:31:08 0 [Note] InnoDB: Using Linux native AIO
mariadb_1        | 2022-11-22 22:31:08 0 [Note] InnoDB: Initializing buffer pool, total size = 134217728, chunk size = 134217728
mariadb_1        | 2022-11-22 22:31:08 0 [Note] InnoDB: Completed initialization of buffer pool
mariadb_1        | 2022-11-22 22:31:09 0 [Note] InnoDB: 128 rollback segments are active.
mariadb_1        | 2022-11-22 22:31:09 0 [Note] InnoDB: Creating shared tablespace for temporary tables
mariadb_1        | 2022-11-22 22:31:09 0 [Note] InnoDB: Setting file './ibtmp1' size to 12 MB. Physically writing the file full; Please wait ...
mariadb_1        | 2022-11-22 22:31:09 0 [Note] InnoDB: File './ibtmp1' size is now 12 MB.
mariadb_1        | 2022-11-22 22:31:10 0 [Note] InnoDB: 10.6.11 started; log sequence number 42132; transaction id 14
mariadb_1        | 2022-11-22 22:31:10 0 [Note] Plugin 'FEEDBACK' is disabled.
mariadb_1        | 2022-11-22 22:31:10 0 [Note] InnoDB: Loading buffer pool(s) from /var/lib/mysql/ib_buffer_pool
mariadb_1        | 2022-11-22 22:31:10 0 [Warning] You need to use --log-bin to make --expire-logs-days or --binlog-expire-logs-seconds work.
mariadb_1        | 2022-11-22 22:31:10 0 [Note] InnoDB: Buffer pool(s) load completed at 221122 22:31:10
mariadb_1        | 2022-11-22 22:31:10 0 [Note] Server socket created on IP: '0.0.0.0'.
mariadb_1        | 2022-11-22 22:31:10 0 [Note] Server socket created on IP: '::'.
mariadb_1        | 2022-11-22 22:31:11 0 [Note] mariadbd: ready for connections.
mariadb_1        | Version: '10.6.11-MariaDB-1:10.6.11+maria~ubu2004'  socket: '/run/mysqld/mysqld.sock'  port: 3306  mariadb.org binary distribution
hatstall_1       | 
hatstall_1       | 163 static files copied to '/home/grimoirelab/grimoirelab-hatstall/django-hatstall/static'.
hatstall_1       | User for django admin created: admin/admin as login
hatstall_1       | AH00558: apache2: Could not reliably determine the server's fully qualified domain name, using 172.18.0.6. Set the 'ServerName' directive globally to suppress this message
hatstall_1       | AH00558: apache2: Could not reliably determine the server's fully qualified domain name, using 172.18.0.6. Set the 'ServerName' directive globally to suppress this message
hatstall_1       | [Tue Nov 22 22:31:21.452974 2022] [mpm_event:notice] [pid 71:tid 139837564859520] AH00489: Apache/2.4.38 (Debian) OpenSSL/1.1.1d mod_wsgi/4.6.5 Python/3.7 configured -- resuming normal operations
hatstall_1       | [Tue Nov 22 22:31:21.475506 2022] [core:notice] [pid 71:tid 139837564859520] AH00094: Command line: '/usr/sbin/apache2 -D FOREGROUND'
kibiter_1        | {"type":"log","@timestamp":"2022-11-22T22:31:44Z","tags":["status","plugin:[email protected]","info"],"pid":1,"state":"green","message":"Status changed from uninitialized to green - Ready","prevState":"uninitialized","prevMsg":"uninitialized"}
kibiter_1        | {"type":"log","@timestamp":"2022-11-22T22:31:45Z","tags":["status","plugin:[email protected]","info"],"pid":1,"state":"yellow","message":"Status changed from uninitialized to yellow - Waiting for Elasticsearch","prevState":"uninitialized","prevMsg":"uninitialized"}
kibiter_1        | {"type":"log","@timestamp":"2022-11-22T22:31:45Z","tags":["status","plugin:[email protected]","info"],"pid":1,"state":"green","message":"Status changed from uninitialized to green - Ready","prevState":"uninitialized","prevMsg":"uninitialized"}
kibiter_1        | {"type":"log","@timestamp":"2022-11-22T22:31:45Z","tags":["status","plugin:[email protected]","info"],"pid":1,"state":"green","message":"Status changed from uninitialized to green - Ready","prevState":"uninitialized","prevMsg":"uninitialized"}
kibiter_1        | {"type":"log","@timestamp":"2022-11-22T22:31:45Z","tags":["status","plugin:[email protected]","info"],"pid":1,"state":"green","message":"Status changed from uninitialized to green - Ready","prevState":"uninitialized","prevMsg":"uninitialized"}
kibiter_1        | {"type":"log","@timestamp":"2022-11-22T22:31:45Z","tags":["status","plugin:[email protected]","info"],"pid":1,"state":"green","message":"Status changed from uninitialized to green - Ready","prevState":"uninitialized","prevMsg":"uninitialized"}
kibiter_1        | {"type":"log","@timestamp":"2022-11-22T22:31:47Z","tags":["status","plugin:[email protected]","info"],"pid":1,"state":"green","message":"Status changed from uninitialized to green - Ready","prevState":"uninitialized","prevMsg":"uninitialized"}
kibiter_1        | {"type":"log","@timestamp":"2022-11-22T22:31:51Z","tags":["status","plugin:[email protected]","error"],"pid":1,"state":"red","message":"Status changed from yellow to red - Request Timeout after 3000ms","prevState":"yellow","prevMsg":"Waiting for Elasticsearch"}
elasticsearch_1  | #
elasticsearch_1  | # A fatal error has been detected by the Java Runtime Environment:
elasticsearch_1  | #
elasticsearch_1  | #  SIGSEGV (0xb) at pc=0x00007fd84ec6d074, pid=1, tid=71
elasticsearch_1  | #
elasticsearch_1  | # JRE version: OpenJDK Runtime Environment (13.0.1+9) (build 13.0.1+9)
elasticsearch_1  | # Java VM: OpenJDK 64-Bit Server VM (13.0.1+9, mixed mode, sharing, tiered, compressed oops, concurrent mark sweep gc, linux-amd64)
elasticsearch_1  | # Problematic frame:
elasticsearch_1  | # V  [libjvm.so+0x767074]  frame::frame(long*, long*, long*, unsigned char*)+0xc4
elasticsearch_1  | #
elasticsearch_1  | # Core dump will be written. Default location: Core dumps may be processed with "/usr/share/apport/apport -p%p -s%s -c%c -d%d -P%P -u%u -g%g -- %E" (or dumping to /usr/share/elasticsearch/core.1)
elasticsearch_1  | #
elasticsearch_1  | # An error report file with more information is saved as:
elasticsearch_1  | # logs/hs_err_pid1.log
elasticsearch_1  | Compiled method (c1)  690885 1562       3       sun.nio.fs.UnixPath::compareTo (92 bytes)
elasticsearch_1  |  total in heap  [0x00007fd830fe8410,0x00007fd830fe8db8] = 2472
elasticsearch_1  |  relocation     [0x00007fd830fe8570,0x00007fd830fe85d8] = 104
elasticsearch_1  |  main code      [0x00007fd830fe85e0,0x00007fd830fe8b20] = 1344
elasticsearch_1  |  stub code      [0x00007fd830fe8b20,0x00007fd830fe8bb0] = 144
elasticsearch_1  |  metadata       [0x00007fd830fe8bb0,0x00007fd830fe8bc8] = 24
elasticsearch_1  |  scopes data    [0x00007fd830fe8bc8,0x00007fd830fe8c78] = 176
elasticsearch_1  |  scopes pcs     [0x00007fd830fe8c78,0x00007fd830fe8d78] = 256
elasticsearch_1  |  dependencies   [0x00007fd830fe8d78,0x00007fd830fe8d80] = 8
elasticsearch_1  |  nul chk table  [0x00007fd830fe8d80,0x00007fd830fe8db8] = 56
elasticsearch_1  | Compiled method (c1)  690886 1562       3       sun.nio.fs.UnixPath::compareTo (92 bytes)
elasticsearch_1  |  total in heap  [0x00007fd830fe8410,0x00007fd830fe8db8] = 2472
elasticsearch_1  |  relocation     [0x00007fd830fe8570,0x00007fd830fe85d8] = 104
elasticsearch_1  |  main code      [0x00007fd830fe85e0,0x00007fd830fe8b20] = 1344
elasticsearch_1  |  stub code      [0x00007fd830fe8b20,0x00007fd830fe8bb0] = 144
elasticsearch_1  |  metadata       [0x00007fd830fe8bb0,0x00007fd830fe8bc8] = 24
elasticsearch_1  |  scopes data    [0x00007fd830fe8bc8,0x00007fd830fe8c78] = 176
elasticsearch_1  |  scopes pcs     [0x00007fd830fe8c78,0x00007fd830fe8d78] = 256
elasticsearch_1  |  dependencies   [0x00007fd830fe8d78,0x00007fd830fe8d80] = 8
elasticsearch_1  |  nul chk table  [0x00007fd830fe8d80,0x00007fd830fe8db8] = 56
elasticsearch_1  | #
elasticsearch_1  | # If you would like to submit a bug report, please visit:
elasticsearch_1  | #   https://github.com/AdoptOpenJDK/openjdk-build/issues
elasticsearch_1  | #
elasticsearch_1  | 
elasticsearch_1  | [error occurred during error reporting (), id 0xb, SIGSEGV (0xb) at pc=0x00007fd84fc07b77]
elasticsearch_1  | 
elasticsearch_1  | 
...
elasticsearch_1  | [Too many errors, abort]
...
docker-compose_elasticsearch_1 exited with code 139
kibiter_1        | {"type":"log","@timestamp":"2022-11-22T22:41:33Z","tags":["error","elasticsearch","admin"],"pid":1,"message":"Request error, retrying\nHEAD http://elasticsearch:9200/ => getaddrinfo EAI_AGAIN elasticsearch elasticsearch:9200"}
kibiter_1        | {"type":"log","@timestamp":"2022-11-22T22:41:33Z","tags":["warning","elasticsearch","admin"],"pid":1,"message":"Unable to revive connection: http://elasticsearch:9200/"}
kibiter_1        | {"type":"log","@timestamp":"2022-11-22T22:41:33Z","tags":["warning","elasticsearch","admin"],"pid":1,"message":"No living connections"}
kibiter_1        | {"type":"log","@timestamp":"2022-11-22T22:41:33Z","tags":["status","plugin:[email protected]","error"],"pid":1,"state":"red","message":"Status changed from red to red - Unable to connect to Elasticsearch.","prevState":"red","prevMsg":"Request Timeout after 3000ms"}
kibiter_1        | {"type":"log","@timestamp":"2022-11-22T22:41:36Z","tags":["warning","elasticsearch","admin"],"pid":1,"message":"Unable to revive connection: http://elasticsearch:9200/"}
kibiter_1        | {"type":"log","@timestamp":"2022-11-22T22:41:36Z","tags":["warning","elasticsearch","admin"],"pid":1,"message":"No living connections"}

Am I doing something wrong? I believe I've set up everything correctly.

I had to modify grimoirelab/docker-compose/docker-compose.yml to mount the .ssh directory as a volume in order to get access to Gerrit, as mentioned above. All I did was add a volumes section like so:

kibiter:
  restart: on-failure:5
  image: bitergia/kibiter:community-v6.8.6-3
  environment:
    - PROJECT_NAME=Demo
    - NODE_OPTIONS=--max-old-space-size=1000
    - ELASTICSEARCH_URL=http://elasticsearch:9200
  links:
    - elasticsearch
  ports:
    - 5601:5601
  volumes:
    - ~/.ssh/:/home/bitergia/.ssh
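
Since the SSH key is actually read by perceval inside the mordred container (kibiter only serves the dashboards), the same mount presumably needs to be applied to the mordred service as well. A sketch, with the container's home directory assumed to match the one used above:

mordred:
  volumes:
    # assumption: adjust the target path to the mordred container user's home
    - ~/.ssh/:/home/bitergia/.ssh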

@loganknecht
Copy link
Author

Hey @zhquan, @jjmerchante, and @sduenas, 👋

Apologies for being a nuisance on this, but am I doing anything particularly wrong in my configuration?

@sduenas
Copy link
Member

sduenas commented Nov 29, 2022

Hey @zhquan, @jjmerchante, and @sduenas, 👋

Apologies for being a nuisance on this, but am I doing anything particularly wrong in my configuration?

I don't think you have anything wrong with your configuration. Before using this emulation of Ubuntu, did you try to run the docker-compose on your machine using the grimoirelab image that we created for Mac?

@loganknecht
Copy link
Author

loganknecht commented Nov 29, 2022

Hey @sduenas

When you say

did you try to run the docker-compose on your machine using the grimoirelab image that we created for Mac?

Can you elaborate a bit on that?

I have tried cloning the repo today, modifying the projects.json, the setup.cfg, and mounting my .ssh into the kibiter service using

    kibiter:
        restart: on-failure:5
        image: bitergia/kibiter:community-v6.8.6-3
        environment:
            - PROJECT_NAME=Demo
            - NODE_OPTIONS=--max-old-space-size=1000
            - ELASTICSEARCH_URL=http://elasticsearch:9200
        links:
            - elasticsearch
        ports:
            - 5601:5601
        volumes:
            - ~/.ssh/:/home/bitergia/.ssh
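
For reference, the gerrit entries in my projects.json and setup.cfg follow the example in the sirmordred README; a sketch with placeholder host, user, and index names:

    projects.json:

    {
        "demo": {
            "gerrit": [
                "gerrit.example.org"
            ]
        }
    }

    setup.cfg:

    [gerrit]
    raw_index = gerrit_demo_raw
    enriched_index = gerrit_demo_enriched
    user = my-gerrit-user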

When I do this, the docker-compose up -d command still fails with the errors I first provided in this issue.

Logs below

 ✘ lknecht ~/Repositories/grimoirelab/docker-compose   master ±⬡ v16.15.1 dc up
[+] Running 5/5
 ⠿ Container docker-compose-mariadb-1        Created                                                                       0.1s
 ⠿ Container docker-compose-elasticsearch-1  Created                                                                       0.1s
 ⠿ Container docker-compose-hatstall-1       Created                                                                       0.1s
 ⠿ Container docker-compose-kibiter-1        Created                                                                       0.1s
 ⠿ Container docker-compose-mordred-1        Created                                                                       0.1s
Attaching to docker-compose-elasticsearch-1, docker-compose-hatstall-1, docker-compose-kibiter-1, docker-compose-mariadb-1, docker-compose-mordred-1
docker-compose-mariadb-1        | 2022-11-29 23:20:27+00:00 [Note] [Entrypoint]: Entrypoint script for MariaDB Server 1:10.6.11+maria~ubu2004 started.
docker-compose-mariadb-1        | 2022-11-29 23:20:27+00:00 [Note] [Entrypoint]: Switching to dedicated user 'mysql'
docker-compose-mariadb-1        | 2022-11-29 23:20:27+00:00 [Note] [Entrypoint]: Entrypoint script for MariaDB Server 1:10.6.11+maria~ubu2004 started.
docker-compose-mariadb-1        | 2022-11-29 23:20:28+00:00 [Note] [Entrypoint]: Initializing database files
docker-compose-hatstall-1       | Running Hatstall
docker-compose-kibiter-1        | runtime: failed to create new OS thread (have 2 already; errno=22)
docker-compose-kibiter-1        | fatal error: runtime.newosproc
docker-compose-kibiter-1        |
docker-compose-kibiter-1        | runtime stack:
docker-compose-kibiter-1        | runtime.throw(0x56be06)
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/panic.go:491 +0xad
docker-compose-kibiter-1        | runtime.newosproc(0xc208020000, 0xc208030000)
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/os_linux.c:170 +0x10a
docker-compose-kibiter-1        | newm(0x430700, 0x0)
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/proc.c:1157 +0xc5
docker-compose-kibiter-1        | runtime.newsysmon()
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/proc.c:169 +0x33
docker-compose-kibiter-1        | runtime.onM(0x57cb30)
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/asm_amd64.s:257 +0x68
docker-compose-kibiter-1        | runtime.mstart()
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/proc.c:818
docker-compose-kibiter-1        |
docker-compose-kibiter-1        | goroutine 1 [running]:
docker-compose-kibiter-1        | runtime.switchtoM()
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/asm_amd64.s:198 fp=0xc20801a798 sp=0xc20801a790
docker-compose-kibiter-1        | runtime.main()
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/proc.go:32 +0x58 fp=0xc20801a7e0 sp=0xc20801a798
docker-compose-kibiter-1        | runtime.goexit()
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/asm_amd64.s:2232 +0x1 fp=0xc20801a7e8 sp=0xc20801a7e0
docker-compose-kibiter-1 exited with code 2
docker-compose-hatstall-1       | Site 000-default disabled.
docker-compose-hatstall-1       | To activate the new configuration, you need to run:
docker-compose-hatstall-1       |   service apache2 reload
docker-compose-kibiter-1        | runtime: failed to create new OS thread (have 2 already; errno=22)
docker-compose-kibiter-1        | fatal error: runtime.newosproc
docker-compose-kibiter-1        |
docker-compose-kibiter-1        | runtime stack:
docker-compose-kibiter-1        | runtime.throw(0x56be06)
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/panic.go:491 +0xad
docker-compose-kibiter-1        | runtime.newosproc(0xc208020000, 0xc208030000)
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/os_linux.c:170 +0x10a
docker-compose-kibiter-1        | newm(0x430700, 0x0)
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/proc.c:1157 +0xc5
docker-compose-kibiter-1        | runtime.newsysmon()
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/proc.c:169 +0x33
docker-compose-kibiter-1        | runtime.onM(0x57cb30)
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/asm_amd64.s:257 +0x68
docker-compose-kibiter-1        | runtime.mstart()
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/proc.c:818
docker-compose-kibiter-1        |
docker-compose-kibiter-1        | goroutine 1 [running]:
docker-compose-kibiter-1        | runtime.switchtoM()
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/asm_amd64.s:198 fp=0xc20801a798 sp=0xc20801a790
docker-compose-kibiter-1        | runtime.main()
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/proc.go:32 +0x58 fp=0xc20801a7e0 sp=0xc20801a798
docker-compose-kibiter-1        | runtime.goexit()
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/asm_amd64.s:2232 +0x1 fp=0xc20801a7e8 sp=0xc20801a7e0
docker-compose-mariadb-1        |
docker-compose-mariadb-1        |
docker-compose-mariadb-1        | PLEASE REMEMBER TO SET A PASSWORD FOR THE MariaDB root USER !
docker-compose-mariadb-1        | To do so, start the server, then issue the following command:
docker-compose-mariadb-1        |
docker-compose-mariadb-1        | '/usr/bin/mysql_secure_installation'
docker-compose-mariadb-1        |
docker-compose-mariadb-1        | which will also give you the option of removing the test
docker-compose-mariadb-1        | databases and anonymous user created by default.  This is
docker-compose-mariadb-1        | strongly recommended for production servers.
docker-compose-mariadb-1        |
docker-compose-mariadb-1        | See the MariaDB Knowledgebase at https://mariadb.com/kb
docker-compose-mariadb-1        |
docker-compose-mariadb-1        | Please report any problems at https://mariadb.org/jira
docker-compose-mariadb-1        |
docker-compose-mariadb-1        | The latest information about MariaDB is available at https://mariadb.org/.
docker-compose-mariadb-1        |
docker-compose-mariadb-1        | Consider joining MariaDB's strong and vibrant community:
docker-compose-mariadb-1        | https://mariadb.org/get-involved/
docker-compose-mariadb-1        |
docker-compose-mariadb-1        | 2022-11-29 23:20:29+00:00 [Note] [Entrypoint]: Database files initialized
docker-compose-mariadb-1        | 2022-11-29 23:20:29+00:00 [Note] [Entrypoint]: Starting temporary server
docker-compose-mariadb-1        | 2022-11-29 23:20:29+00:00 [Note] [Entrypoint]: Waiting for server startup
docker-compose-kibiter-1 exited with code 2
docker-compose-hatstall-1       | Enabling site apache-hatstall.
docker-compose-mariadb-1        | 2022-11-29 23:20:29 0 [Note] mariadbd (server 10.6.11-MariaDB-1:10.6.11+maria~ubu2004) starting as process 92 ...
docker-compose-mariadb-1        | 2022-11-29 23:20:29 0 [Note] InnoDB: Compressed tables use zlib 1.2.11
docker-compose-mariadb-1        | 2022-11-29 23:20:29 0 [Note] InnoDB: Number of pools: 1
docker-compose-mariadb-1        | 2022-11-29 23:20:29 0 [Note] InnoDB: Using ARMv8 crc32 + pmull instructions
docker-compose-mariadb-1        | 2022-11-29 23:20:29 0 [Note] mariadbd: O_TMPFILE is not supported on /tmp (disabling future attempts)
docker-compose-mariadb-1        | 2022-11-29 23:20:29 0 [Note] InnoDB: Using Linux native AIO
docker-compose-mariadb-1        | 2022-11-29 23:20:29 0 [Note] InnoDB: Initializing buffer pool, total size = 134217728, chunk size = 134217728
docker-compose-mariadb-1        | 2022-11-29 23:20:29 0 [Note] InnoDB: Completed initialization of buffer pool
docker-compose-mariadb-1        | 2022-11-29 23:20:29 0 [Note] InnoDB: 128 rollback segments are active.
docker-compose-mariadb-1        | 2022-11-29 23:20:29 0 [Note] InnoDB: Creating shared tablespace for temporary tables
docker-compose-mariadb-1        | 2022-11-29 23:20:29 0 [Note] InnoDB: Setting file './ibtmp1' size to 12 MB. Physically writing the file full; Please wait ...
docker-compose-mariadb-1        | 2022-11-29 23:20:29 0 [Note] InnoDB: File './ibtmp1' size is now 12 MB.
docker-compose-mariadb-1        | 2022-11-29 23:20:29 0 [Note] InnoDB: 10.6.11 started; log sequence number 42266; transaction id 14
docker-compose-mariadb-1        | 2022-11-29 23:20:29 0 [Note] Plugin 'FEEDBACK' is disabled.
docker-compose-mariadb-1        | 2022-11-29 23:20:29 0 [Warning] 'user' entry 'root@30d61cce682f' ignored in --skip-name-resolve mode.
docker-compose-mariadb-1        | 2022-11-29 23:20:29 0 [Warning] 'proxies_priv' entry '@% root@30d61cce682f' ignored in --skip-name-resolve mode.
docker-compose-hatstall-1       | To activate the new configuration, you need to run:
docker-compose-hatstall-1       |   service apache2 reload
docker-compose-mariadb-1        | 2022-11-29 23:20:29 0 [Note] mariadbd: ready for connections.
docker-compose-mariadb-1        | Version: '10.6.11-MariaDB-1:10.6.11+maria~ubu2004'  socket: '/run/mysqld/mysqld.sock'  port: 0  mariadb.org binary distribution
docker-compose-kibiter-1        | runtime: failed to create new OS thread (have 2 already; errno=22)
docker-compose-kibiter-1        | fatal error: runtime.newosproc
docker-compose-kibiter-1        |
docker-compose-kibiter-1        | runtime stack:
docker-compose-kibiter-1        | runtime.throw(0x56be06)
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/panic.go:491 +0xad
docker-compose-kibiter-1        | runtime.newosproc(0xc208020000, 0xc208030000)
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/os_linux.c:170 +0x10a
docker-compose-kibiter-1        | newm(0x430700, 0x0)
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/proc.c:1157 +0xc5
docker-compose-kibiter-1        | runtime.newsysmon()
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/proc.c:169 +0x33
docker-compose-kibiter-1        | runtime.onM(0x57cb30)
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/asm_amd64.s:257 +0x68
docker-compose-kibiter-1        | runtime.mstart()
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/proc.c:818
docker-compose-kibiter-1        |
docker-compose-kibiter-1        | goroutine 1 [running]:
docker-compose-kibiter-1        | runtime.switchtoM()
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/asm_amd64.s:198 fp=0xc20801a798 sp=0xc20801a790
docker-compose-kibiter-1        | runtime.main()
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/proc.go:32 +0x58 fp=0xc20801a7e0 sp=0xc20801a798
docker-compose-kibiter-1        | runtime.goexit()
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/asm_amd64.s:2232 +0x1 fp=0xc20801a7e8 sp=0xc20801a7e0
docker-compose-hatstall-1       | Considering dependency setenvif for ssl:
docker-compose-hatstall-1       | Module setenvif already enabled
docker-compose-hatstall-1       | Considering dependency mime for ssl:
docker-compose-hatstall-1       | Module mime already enabled
docker-compose-hatstall-1       | Considering dependency socache_shmcb for ssl:
docker-compose-hatstall-1       | Enabling module socache_shmcb.
docker-compose-kibiter-1 exited with code 2
docker-compose-hatstall-1       | Enabling module ssl.
docker-compose-hatstall-1       | See /usr/share/doc/apache2/README.Debian.gz on how to configure SSL and create self-signed certificates.
docker-compose-hatstall-1       | To activate the new configuration, you need to run:
docker-compose-hatstall-1       |   service apache2 restart
docker-compose-mariadb-1        | 2022-11-29 23:20:30+00:00 [Note] [Entrypoint]: Temporary server started.
docker-compose-kibiter-1        | runtime: failed to create new OS thread (have 2 already; errno=22)
docker-compose-kibiter-1        | fatal error: runtime.newosproc
docker-compose-kibiter-1        |
docker-compose-kibiter-1        | runtime stack:
docker-compose-kibiter-1        | runtime.throw(0x56be06)
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/panic.go:491 +0xad
docker-compose-kibiter-1        | runtime.newosproc(0xc208020000, 0xc208030000)
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/os_linux.c:170 +0x10a
docker-compose-kibiter-1        | newm(0x430700, 0x0)
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/proc.c:1157 +0xc5
docker-compose-kibiter-1        | runtime.newsysmon()
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/proc.c:169 +0x33
docker-compose-kibiter-1        | runtime.onM(0x57cb30)
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/asm_amd64.s:257 +0x68
docker-compose-kibiter-1        | runtime.mstart()
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/proc.c:818
docker-compose-kibiter-1        |
docker-compose-kibiter-1        | goroutine 1 [running]:
docker-compose-kibiter-1        | runtime.switchtoM()
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/asm_amd64.s:198 fp=0xc20801a798 sp=0xc20801a790
docker-compose-kibiter-1        | runtime.main()
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/proc.go:32 +0x58 fp=0xc20801a7e0 sp=0xc20801a798
docker-compose-kibiter-1        | runtime.goexit()
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/asm_amd64.s:2232 +0x1 fp=0xc20801a7e8 sp=0xc20801a7e0
docker-compose-kibiter-1 exited with code 2
docker-compose-hatstall-1       | Django configured for deployment (secret, debug, allowed_hosts) in django_hatstall/settings.py
docker-compose-mariadb-1        | 2022-11-29 23:20:31+00:00 [Note] [Entrypoint]: Securing system users (equivalent to running mysql_secure_installation)
docker-compose-mariadb-1        |
docker-compose-mariadb-1        | 2022-11-29 23:20:31+00:00 [Note] [Entrypoint]: Stopping temporary server
docker-compose-mariadb-1        | 2022-11-29 23:20:31 0 [Note] mariadbd (initiated by: unknown): Normal shutdown
docker-compose-mariadb-1        | 2022-11-29 23:20:31 0 [Note] InnoDB: FTS optimize thread exiting.
docker-compose-mariadb-1        | 2022-11-29 23:20:31 0 [Note] InnoDB: Starting shutdown...
docker-compose-mariadb-1        | 2022-11-29 23:20:31 0 [Note] InnoDB: Dumping buffer pool(s) to /var/lib/mysql/ib_buffer_pool
docker-compose-mariadb-1        | 2022-11-29 23:20:31 0 [Note] InnoDB: Buffer pool(s) dump completed at 221129 23:20:31
docker-compose-mariadb-1        | 2022-11-29 23:20:32 0 [Note] InnoDB: Removed temporary tablespace data file: "./ibtmp1"
docker-compose-mariadb-1        | 2022-11-29 23:20:32 0 [Note] InnoDB: Shutdown completed; log sequence number 42278; transaction id 15
docker-compose-mariadb-1        | 2022-11-29 23:20:32 0 [Note] mariadbd: Shutdown complete
docker-compose-mariadb-1        |
docker-compose-mariadb-1        | 2022-11-29 23:20:32+00:00 [Note] [Entrypoint]: Temporary server stopped
docker-compose-mariadb-1        |
docker-compose-mariadb-1        | 2022-11-29 23:20:32+00:00 [Note] [Entrypoint]: MariaDB init process done. Ready for start up.
docker-compose-mariadb-1        |
docker-compose-mariadb-1        | 2022-11-29 23:20:32 0 [Note] mariadbd (server 10.6.11-MariaDB-1:10.6.11+maria~ubu2004) starting as process 1 ...
docker-compose-mariadb-1        | 2022-11-29 23:20:32 0 [Note] InnoDB: Compressed tables use zlib 1.2.11
docker-compose-mariadb-1        | 2022-11-29 23:20:32 0 [Note] InnoDB: Number of pools: 1
docker-compose-mariadb-1        | 2022-11-29 23:20:32 0 [Note] InnoDB: Using ARMv8 crc32 + pmull instructions
docker-compose-mariadb-1        | 2022-11-29 23:20:32 0 [Note] mariadbd: O_TMPFILE is not supported on /tmp (disabling future attempts)
docker-compose-mariadb-1        | 2022-11-29 23:20:32 0 [Note] InnoDB: Using Linux native AIO
docker-compose-mariadb-1        | 2022-11-29 23:20:32 0 [Note] InnoDB: Initializing buffer pool, total size = 134217728, chunk size = 134217728
docker-compose-mariadb-1        | 2022-11-29 23:20:32 0 [Note] InnoDB: Completed initialization of buffer pool
docker-compose-mariadb-1        | 2022-11-29 23:20:32 0 [Note] InnoDB: 128 rollback segments are active.
docker-compose-mariadb-1        | 2022-11-29 23:20:32 0 [Note] InnoDB: Creating shared tablespace for temporary tables
docker-compose-mariadb-1        | 2022-11-29 23:20:32 0 [Note] InnoDB: Setting file './ibtmp1' size to 12 MB. Physically writing the file full; Please wait ...
docker-compose-mariadb-1        | 2022-11-29 23:20:32 0 [Note] InnoDB: File './ibtmp1' size is now 12 MB.
docker-compose-mariadb-1        | 2022-11-29 23:20:32 0 [Note] InnoDB: 10.6.11 started; log sequence number 42278; transaction id 14
docker-compose-mariadb-1        | 2022-11-29 23:20:32 0 [Note] Plugin 'FEEDBACK' is disabled.
docker-compose-mariadb-1        | 2022-11-29 23:20:32 0 [Note] InnoDB: Loading buffer pool(s) from /var/lib/mysql/ib_buffer_pool
docker-compose-mariadb-1        | 2022-11-29 23:20:32 0 [Note] InnoDB: Buffer pool(s) load completed at 221129 23:20:32
docker-compose-mariadb-1        | 2022-11-29 23:20:32 0 [Warning] You need to use --log-bin to make --expire-logs-days or --binlog-expire-logs-seconds work.
docker-compose-mariadb-1        | 2022-11-29 23:20:32 0 [Note] Server socket created on IP: '0.0.0.0'.
docker-compose-mariadb-1        | 2022-11-29 23:20:32 0 [Note] Server socket created on IP: '::'.
docker-compose-mariadb-1        | 2022-11-29 23:20:32 0 [Note] mariadbd: ready for connections.
docker-compose-mariadb-1        | Version: '10.6.11-MariaDB-1:10.6.11+maria~ubu2004'  socket: '/run/mysqld/mysqld.sock'  port: 3306  mariadb.org binary distribution
docker-compose-kibiter-1        | runtime: failed to create new OS thread (have 2 already; errno=22)
docker-compose-kibiter-1        | fatal error: runtime.newosproc
docker-compose-kibiter-1        |
docker-compose-kibiter-1        | runtime stack:
docker-compose-kibiter-1        | runtime.throw(0x56be06)
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/panic.go:491 +0xad
docker-compose-kibiter-1        | runtime.newosproc(0xc208020000, 0xc208030000)
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/os_linux.c:170 +0x10a
docker-compose-kibiter-1        | newm(0x430700, 0x0)
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/proc.c:1157 +0xc5
docker-compose-kibiter-1        | runtime.newsysmon()
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/proc.c:169 +0x33
docker-compose-kibiter-1        | runtime.onM(0x57cb30)
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/asm_amd64.s:257 +0x68
docker-compose-kibiter-1        | runtime.mstart()
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/proc.c:818
docker-compose-kibiter-1        |
docker-compose-kibiter-1        | goroutine 1 [running]:
docker-compose-kibiter-1        | runtime.switchtoM()
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/asm_amd64.s:198 fp=0xc20801a798 sp=0xc20801a790
docker-compose-kibiter-1        | runtime.main()
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/proc.go:32 +0x58 fp=0xc20801a7e0 sp=0xc20801a798
docker-compose-kibiter-1        | runtime.goexit()
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/asm_amd64.s:2232 +0x1 fp=0xc20801a7e8 sp=0xc20801a7e0
docker-compose-kibiter-1 exited with code 2
docker-compose-elasticsearch-1  | OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
docker-compose-elasticsearch-1  | OpenJDK 64-Bit Server VM warning: UseAVX=2 is not supported on this CPU, setting it to UseAVX=0
^CGracefully stopping... (press Ctrl+C again to force)
[+] Running 1/3
 ⠴ Container docker-compose-hatstall-1  Stopping                                                                           3.6s
 ⠴ Container docker-compose-mordred-1   Stopping                                                                           3.6s
 ⠿ Container docker-compose-kibiter-1   Stopped                                                                            0.0s
[+] Running 3/4
[+] Running 5/5
 ⠿ Container docker-compose-elasticsearch-1  Killing                                                                       0.5s
 ⠿ Container docker-compose-hatstall-1       Stopped                                                                       3.9s
 ⠿ Container docker-compose-mordred-1        Stopped                                                                       4.0s
 ⠿ Container docker-compose-kibiter-1        Stopped                                                                       0.0s
 ⠿ Container docker-compose-mariadb-1        Stopped                                                                       0.0s
 ⠿ Container docker-compose-elasticsearch-1  Stopped                                                                       0.4s
canceled

 ✘ lknecht ~/Repositories/grimoirelab/docker-compose   master ±⬡ v16.15.1 clear
lknecht ~/Repositories/grimoirelab/docker-compose   master ±⬡ v16.15.1 git stash
Saved working directory and index state WIP on master: abf22d1 Release 0.7.1
lknecht ~/Repositories/grimoirelab/docker-compose   master ⬡ v16.15.1 gn
NUKING GIT CONFIGURATIONS BACK TO HEAD
HEAD is now at abf22d1 Release 0.7.1
Already up to date.
lknecht ~/Repositories/grimoirelab/docker-compose   master ⬡ v16.15.1 git stash apply
On branch master
Your branch is up to date with 'origin/master'.

Changes not staged for commit:
  (use "git add <file>..." to update what will be committed)
  (use "git restore <file>..." to discard changes in working directory)
	modified:   ../default-grimoirelab-settings/projects.json
	modified:   ../default-grimoirelab-settings/setup.cfg
	modified:   docker-compose.yml

no changes added to commit (use "git add" and/or "git commit -a")
lknecht ~/Repositories/grimoirelab/docker-compose   master ±⬡ v16.15.1
 ✘ lknecht ~/Repositories/grimoirelab/docker-compose   master ±⬡ v16.15.1 clear
lknecht ~/Repositories/grimoirelab/docker-compose   master ±⬡ v16.15.1 dnc
0c7329cf2244
8a3dde65cbc5
f60edf2bc4e4
30d61cce682f
5db890d0f724
1be42e62d593
dd203acbef90
1a4303bd727b
4a384ef9363f
650dbdca02c3
83327fe87887
9aaba81095e0
678d20e9537b
lknecht ~/Repositories/grimoirelab/docker-compose   master ±⬡ v16.15.1 dnv
0c35c372a042a634408d1f92bdbd656704449f502324cbf3d3ccb63ea98ae45c
84f6648003aff4580f55eea0c76f5fa0fb648cd4cd3b7371a8d15e8a986aa9a3
Error: No such volume: local
Error: No such volume: local
 ✘ lknecht ~/Repositories/grimoirelab/docker-compose   master ±⬡ v16.15.1 clear
lknecht ~/Repositories/grimoirelab/docker-compose   master ±⬡ v16.15.1 >....
docker-compose-mariadb-1        | 2022-11-29 23:20:32 0 [Warning] You need to use --log-bin to make --expire-logs-days or --binlog-expire-logs-seconds work.
docker-compose-mariadb-1        | 2022-11-29 23:20:32 0 [Note] Server socket created on IP: '0.0.0.0'.
docker-compose-mariadb-1        | 2022-11-29 23:20:32 0 [Note] Server socket created on IP: '::'.
docker-compose-mariadb-1        | 2022-11-29 23:20:32 0 [Note] mariadbd: ready for connections.
docker-compose-mariadb-1        | Version: '10.6.11-MariaDB-1:10.6.11+maria~ubu2004'  socket: '/run/mysqld/mysqld.sock'  port: 3306  mariadb.org binary distribution
docker-compose-kibiter-1        | runtime: failed to create new OS thread (have 2 already; errno=22)
docker-compose-kibiter-1        | fatal error: runtime.newosproc
docker-compose-kibiter-1        |
docker-compose-kibiter-1        | runtime stack:
docker-compose-kibiter-1        | runtime.throw(0x56be06)
docker-compose-kibiter-1        |       /usr/src/go/src/runtime/panic.go:491 +0xad
docker-compose-kibiter-1        | runtime.newosproc(0xc208020000, 0xc208030000)
docker-compose-kibiter-1        |       /usr/src/go/src/runtime/os_linux.c:170 +0x10a
docker-compose-kibiter-1        | newm(0x430700, 0x0)
docker-compose-kibiter-1        |       /usr/src/go/src/runtime/proc.c:1157 +0xc5
docker-compose-kibiter-1        | runtime.newsysmon()
docker-compose-kibiter-1        |       /usr/src/go/src/runtime/proc.c:169 +0x33
docker-compose-kibiter-1        | runtime.onM(0x57cb30)
docker-compose-kibiter-1        |       /usr/src/go/src/runtime/asm_amd64.s:257 +0x68
docker-compose-kibiter-1        | runtime.mstart()
docker-compose-kibiter-1        |       /usr/src/go/src/runtime/proc.c:818
docker-compose-kibiter-1        |
docker-compose-kibiter-1        | goroutine 1 [running]:
docker-compose-kibiter-1        | runtime.switchtoM()
docker-compose-kibiter-1        |       /usr/src/go/src/runtime/asm_amd64.s:198 fp=0xc20801a798 sp=0xc20801a790
docker-compose-kibiter-1        | runtime.main()
docker-compose-kibiter-1        |       /usr/src/go/src/runtime/proc.go:32 +0x58 fp=0xc20801a7e0 sp=0xc20801a798
docker-compose-kibiter-1        | runtime.goexit()
docker-compose-kibiter-1        |       /usr/src/go/src/runtime/asm_amd64.s:2232 +0x1 fp=0xc20801a7e8 sp=0xc20801a7e0
docker-compose-kibiter-1 exited with code 2
zsh: parse error near `)'
 ✘ lknecht ~/Repositories/grimoirelab/docker-compose   master ±⬡ v16.15.1 la
total 192
-rw-r--r--  1 lknecht  MAGICLEAP\ml   6.1K Oct 18 08:16 README.md
-rw-r--r--  1 lknecht  MAGICLEAP\ml   2.2K Nov 29 15:19 docker-compose-opensearch.yml
-rw-r--r--  1 lknecht  MAGICLEAP\ml   1.9K Oct 18 08:16 docker-compose-secured.yml
-rw-r--r--  1 lknecht  MAGICLEAP\ml   1.9K Nov 29 15:20 docker-compose.yml
-rw-r--r--  1 lknecht  MAGICLEAP\ml    73K Oct  3 09:21 mac-docker-configuration.png
lknecht ~/Repositories/grimoirelab/docker-compose   master ±⬡ v16.15.1 clear
lknecht ~/Repositories/grimoirelab/docker-compose   master ±⬡ v16.15.1 docker-compose up -d
[+] Running 5/5
 ⠿ Container docker-compose-mariadb-1        Started                                                                       0.6s
 ⠿ Container docker-compose-elasticsearch-1  Started                                                                       0.8s
 ⠿ Container docker-compose-kibiter-1        Started                                                                       1.3s
 ⠿ Container docker-compose-hatstall-1       Started                                                                       1.0s
 ⠿ Container docker-compose-mordred-1        Started                                                                       1.2s
lknecht ~/Repositories/grimoirelab/docker-compose   master ±⬡ v16.15.1
 ✘ lknecht ~/Repositories/grimoirelab/docker-compose   master ±⬡ v16.15.1 dnc
9b5e13ec8cce
5e156b28d87a
a0f1952ebe7d
7518ecee00a9
8d8dda7cb880
20137e122682
98b7931b0e2f
5c3e59ca9f8b
01b09c91d8e7
95c597f4ec88
ff9a8304fcb8
0a95306adcf4
7bc0c00bf3c1
lknecht ~/Repositories/grimoirelab/docker-compose   master ±⬡ v16.15.1 dnv
0f5874b4d2d3947f6174a65c3f4a9776d7e8781e95b76b71aa0f55c3a18cfcc3
4c5b03e26953168ae9c7b3f8fe243231736ecbccdcda018a95e85d149abfe4c8
Error: No such volume: local
Error: No such volume: local
 ✘ lknecht ~/Repositories/grimoirelab/docker-compose   master ±⬡ v16.15.1 clear
lknecht ~/Repositories/grimoirelab/docker-compose   master ±⬡ v16.15.1 la
total 192
-rw-r--r--  1 lknecht  MAGICLEAP\ml   6.1K Oct 18 08:16 README.md
-rw-r--r--  1 lknecht  MAGICLEAP\ml   2.2K Nov 29 15:19 docker-compose-opensearch.yml
-rw-r--r--  1 lknecht  MAGICLEAP\ml   1.9K Oct 18 08:16 docker-compose-secured.yml
-rw-r--r--  1 lknecht  MAGICLEAP\ml   1.9K Nov 29 15:20 docker-compose.yml
-rw-r--r--  1 lknecht  MAGICLEAP\ml    73K Oct  3 09:21 mac-docker-configuration.png
lknecht ~/Repositories/grimoirelab/docker-compose   master ±⬡ v16.15.1 dnc
b639b9a078f6
3e399f2ab4e9
021d738efbee
91b92c713d2d
029587cb9428
6ab91a7b6048
5a282f500e5c
a8224128d842
lknecht ~/Repositories/grimoirelab/docker-compose   master ±⬡ v16.15.1 docker-compose up
[+] Running 5/5
 ⠿ Container docker-compose-elasticsearch-1  Created                                                                       0.1s
 ⠿ Container docker-compose-mariadb-1        Created                                                                       0.1s
 ⠿ Container docker-compose-hatstall-1       Created                                                                       0.1s
 ⠿ Container docker-compose-mordred-1        Created                                                                       0.1s
 ⠿ Container docker-compose-kibiter-1        Created                                                                       0.1s
Attaching to docker-compose-elasticsearch-1, docker-compose-hatstall-1, docker-compose-kibiter-1, docker-compose-mariadb-1, docker-compose-mordred-1
docker-compose-mariadb-1        | 2022-11-29 23:24:10+00:00 [Note] [Entrypoint]: Entrypoint script for MariaDB Server 1:10.6.11+maria~ubu2004 started.
docker-compose-mariadb-1        | 2022-11-29 23:24:10+00:00 [Note] [Entrypoint]: Switching to dedicated user 'mysql'
docker-compose-mariadb-1        | 2022-11-29 23:24:10+00:00 [Note] [Entrypoint]: Entrypoint script for MariaDB Server 1:10.6.11+maria~ubu2004 started.
docker-compose-mariadb-1        | 2022-11-29 23:24:11+00:00 [Note] [Entrypoint]: Initializing database files
docker-compose-hatstall-1       | Running Hatstall
docker-compose-kibiter-1        | runtime: failed to create new OS thread (have 2 already; errno=22)
docker-compose-kibiter-1        | fatal error: runtime.newosproc
docker-compose-kibiter-1        |
docker-compose-kibiter-1        | runtime stack:
docker-compose-kibiter-1        | runtime.throw(0x56be06)
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/panic.go:491 +0xad
docker-compose-kibiter-1        | runtime.newosproc(0xc208020000, 0xc208030000)
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/os_linux.c:170 +0x10a
docker-compose-kibiter-1        | newm(0x430700, 0x0)
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/proc.c:1157 +0xc5
docker-compose-kibiter-1        | runtime.newsysmon()
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/proc.c:169 +0x33
docker-compose-kibiter-1        | runtime.onM(0x57cb30)
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/asm_amd64.s:257 +0x68
docker-compose-kibiter-1        | runtime.mstart()
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/proc.c:818
docker-compose-kibiter-1        |
docker-compose-kibiter-1        | goroutine 1 [running]:
docker-compose-kibiter-1        | runtime.switchtoM()
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/asm_amd64.s:198 fp=0xc20801a798 sp=0xc20801a790
docker-compose-kibiter-1        | runtime.main()
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/proc.go:32 +0x58 fp=0xc20801a7e0 sp=0xc20801a798
docker-compose-kibiter-1        | runtime.goexit()
docker-compose-kibiter-1        | 	/usr/src/go/src/runtime/asm_amd64.s:2232 +0x1 fp=0xc20801a7e8 sp=0xc20801a7e0
docker-compose-kibiter-1 exited with code 2
docker-compose-hatstall-1       | Site 000-default disabled.
docker-compose-hatstall-1       | To activate the new configuration, you need to run:
docker-compose-hatstall-1       |   service apache2 reload
docker-compose-kibiter-1        | runtime: failed to create new OS thread (have 2 already; errno=22)
docker-compose-kibiter-1        | (same runtime.newosproc panic and stack trace as above)
docker-compose-kibiter-1 exited with code 2
docker-compose-mariadb-1        |
docker-compose-mariadb-1        |
docker-compose-mariadb-1        | PLEASE REMEMBER TO SET A PASSWORD FOR THE MariaDB root USER !
docker-compose-mariadb-1        | To do so, start the server, then issue the following command:
docker-compose-mariadb-1        |
docker-compose-mariadb-1        | '/usr/bin/mysql_secure_installation'
docker-compose-mariadb-1        |
docker-compose-mariadb-1        | which will also give you the option of removing the test
docker-compose-mariadb-1        | databases and anonymous user created by default.  This is
docker-compose-mariadb-1        | strongly recommended for production servers.
docker-compose-mariadb-1        |
docker-compose-mariadb-1        | See the MariaDB Knowledgebase at https://mariadb.com/kb
docker-compose-mariadb-1        |
docker-compose-mariadb-1        | Please report any problems at https://mariadb.org/jira
docker-compose-mariadb-1        |
docker-compose-mariadb-1        | The latest information about MariaDB is available at https://mariadb.org/.
docker-compose-mariadb-1        |
docker-compose-mariadb-1        | Consider joining MariaDB's strong and vibrant community:
docker-compose-mariadb-1        | https://mariadb.org/get-involved/
docker-compose-mariadb-1        |
docker-compose-mariadb-1        | 2022-11-29 23:24:12+00:00 [Note] [Entrypoint]: Database files initialized
docker-compose-mariadb-1        | 2022-11-29 23:24:12+00:00 [Note] [Entrypoint]: Starting temporary server
docker-compose-mariadb-1        | 2022-11-29 23:24:12+00:00 [Note] [Entrypoint]: Waiting for server startup
docker-compose-mariadb-1        | 2022-11-29 23:24:12 0 [Note] mariadbd (server 10.6.11-MariaDB-1:10.6.11+maria~ubu2004) starting as process 92 ...
docker-compose-mariadb-1        | 2022-11-29 23:24:12 0 [Note] InnoDB: Compressed tables use zlib 1.2.11
docker-compose-mariadb-1        | 2022-11-29 23:24:12 0 [Note] InnoDB: Number of pools: 1
docker-compose-mariadb-1        | 2022-11-29 23:24:12 0 [Note] InnoDB: Using ARMv8 crc32 + pmull instructions
docker-compose-mariadb-1        | 2022-11-29 23:24:12 0 [Note] mariadbd: O_TMPFILE is not supported on /tmp (disabling future attempts)
docker-compose-mariadb-1        | 2022-11-29 23:24:12 0 [Note] InnoDB: Using Linux native AIO
docker-compose-mariadb-1        | 2022-11-29 23:24:12 0 [Note] InnoDB: Initializing buffer pool, total size = 134217728, chunk size = 134217728
docker-compose-mariadb-1        | 2022-11-29 23:24:12 0 [Note] InnoDB: Completed initialization of buffer pool
docker-compose-mariadb-1        | 2022-11-29 23:24:12 0 [Note] InnoDB: 128 rollback segments are active.
docker-compose-mariadb-1        | 2022-11-29 23:24:12 0 [Note] InnoDB: Creating shared tablespace for temporary tables
docker-compose-mariadb-1        | 2022-11-29 23:24:12 0 [Note] InnoDB: Setting file './ibtmp1' size to 12 MB. Physically writing the file full; Please wait ...
docker-compose-mariadb-1        | 2022-11-29 23:24:12 0 [Note] InnoDB: File './ibtmp1' size is now 12 MB.
docker-compose-mariadb-1        | 2022-11-29 23:24:12 0 [Note] InnoDB: 10.6.11 started; log sequence number 42120; transaction id 14
docker-compose-mariadb-1        | 2022-11-29 23:24:12 0 [Note] Plugin 'FEEDBACK' is disabled.
docker-compose-mariadb-1        | 2022-11-29 23:24:12 0 [Warning] 'user' entry 'root@8fd2ad4d033a' ignored in --skip-name-resolve mode.
docker-compose-mariadb-1        | 2022-11-29 23:24:12 0 [Warning] 'proxies_priv' entry '@% root@8fd2ad4d033a' ignored in --skip-name-resolve mode.
docker-compose-mariadb-1        | 2022-11-29 23:24:12 0 [Note] mariadbd: ready for connections.
docker-compose-mariadb-1        | Version: '10.6.11-MariaDB-1:10.6.11+maria~ubu2004'  socket: '/run/mysqld/mysqld.sock'  port: 0  mariadb.org binary distribution
docker-compose-hatstall-1       | Enabling site apache-hatstall.
docker-compose-hatstall-1       | To activate the new configuration, you need to run:
docker-compose-hatstall-1       |   service apache2 reload
docker-compose-kibiter-1        | runtime: failed to create new OS thread (have 2 already; errno=22)
docker-compose-kibiter-1        | (same runtime.newosproc panic and stack trace as above)
docker-compose-kibiter-1 exited with code 2
docker-compose-hatstall-1       | Considering dependency setenvif for ssl:
docker-compose-hatstall-1       | Module setenvif already enabled
docker-compose-hatstall-1       | Considering dependency mime for ssl:
docker-compose-hatstall-1       | Module mime already enabled
docker-compose-hatstall-1       | Considering dependency socache_shmcb for ssl:
docker-compose-hatstall-1       | Enabling module socache_shmcb.
docker-compose-hatstall-1       | Enabling module ssl.
docker-compose-hatstall-1       | See /usr/share/doc/apache2/README.Debian.gz on how to configure SSL and create self-signed certificates.
docker-compose-hatstall-1       | To activate the new configuration, you need to run:
docker-compose-hatstall-1       |   service apache2 restart
docker-compose-mariadb-1        | 2022-11-29 23:24:13+00:00 [Note] [Entrypoint]: Temporary server started.
docker-compose-kibiter-1        | runtime: failed to create new OS thread (have 2 already; errno=22)
docker-compose-kibiter-1        | (same runtime.newosproc panic and stack trace as above)
docker-compose-kibiter-1 exited with code 2
docker-compose-hatstall-1       | Django configured for deployment (secret, debug, allowed_hosts) in django_hatstall/settings.py
docker-compose-mariadb-1        | 2022-11-29 23:24:14+00:00 [Note] [Entrypoint]: Securing system users (equivalent to running mysql_secure_installation)
docker-compose-mariadb-1        |
docker-compose-mariadb-1        | 2022-11-29 23:24:14+00:00 [Note] [Entrypoint]: Stopping temporary server
docker-compose-mariadb-1        | 2022-11-29 23:24:14 0 [Note] mariadbd (initiated by: unknown): Normal shutdown
docker-compose-mariadb-1        | 2022-11-29 23:24:14 0 [Note] InnoDB: FTS optimize thread exiting.
docker-compose-mariadb-1        | 2022-11-29 23:24:14 0 [Note] InnoDB: Starting shutdown...
docker-compose-mariadb-1        | 2022-11-29 23:24:14 0 [Note] InnoDB: Dumping buffer pool(s) to /var/lib/mysql/ib_buffer_pool
docker-compose-mariadb-1        | 2022-11-29 23:24:14 0 [Note] InnoDB: Buffer pool(s) dump completed at 221129 23:24:14
docker-compose-mariadb-1        | 2022-11-29 23:24:14 0 [Note] InnoDB: Removed temporary tablespace data file: "./ibtmp1"
docker-compose-mariadb-1        | 2022-11-29 23:24:14 0 [Note] InnoDB: Shutdown completed; log sequence number 42132; transaction id 15
docker-compose-mariadb-1        | 2022-11-29 23:24:14 0 [Note] mariadbd: Shutdown complete
docker-compose-mariadb-1        |
docker-compose-mariadb-1        | 2022-11-29 23:24:14+00:00 [Note] [Entrypoint]: Temporary server stopped
docker-compose-mariadb-1        |
docker-compose-mariadb-1        | 2022-11-29 23:24:14+00:00 [Note] [Entrypoint]: MariaDB init process done. Ready for start up.
docker-compose-mariadb-1        |
docker-compose-mariadb-1        | 2022-11-29 23:24:14 0 [Note] mariadbd (server 10.6.11-MariaDB-1:10.6.11+maria~ubu2004) starting as process 1 ...
docker-compose-mariadb-1        | 2022-11-29 23:24:14 0 [Note] InnoDB: Compressed tables use zlib 1.2.11
docker-compose-mariadb-1        | 2022-11-29 23:24:14 0 [Note] InnoDB: Number of pools: 1
docker-compose-mariadb-1        | 2022-11-29 23:24:14 0 [Note] InnoDB: Using ARMv8 crc32 + pmull instructions
docker-compose-mariadb-1        | 2022-11-29 23:24:14 0 [Note] mariadbd: O_TMPFILE is not supported on /tmp (disabling future attempts)
docker-compose-mariadb-1        | 2022-11-29 23:24:14 0 [Note] InnoDB: Using Linux native AIO
docker-compose-mariadb-1        | 2022-11-29 23:24:14 0 [Note] InnoDB: Initializing buffer pool, total size = 134217728, chunk size = 134217728
docker-compose-mariadb-1        | 2022-11-29 23:24:14 0 [Note] InnoDB: Completed initialization of buffer pool
docker-compose-mariadb-1        | 2022-11-29 23:24:15 0 [Note] InnoDB: 128 rollback segments are active.
docker-compose-mariadb-1        | 2022-11-29 23:24:15 0 [Note] InnoDB: Creating shared tablespace for temporary tables
docker-compose-mariadb-1        | 2022-11-29 23:24:15 0 [Note] InnoDB: Setting file './ibtmp1' size to 12 MB. Physically writing the file full; Please wait ...
docker-compose-mariadb-1        | 2022-11-29 23:24:15 0 [Note] InnoDB: File './ibtmp1' size is now 12 MB.
docker-compose-mariadb-1        | 2022-11-29 23:24:15 0 [Note] InnoDB: 10.6.11 started; log sequence number 42132; transaction id 14
docker-compose-mariadb-1        | 2022-11-29 23:24:15 0 [Note] InnoDB: Loading buffer pool(s) from /var/lib/mysql/ib_buffer_pool
docker-compose-mariadb-1        | 2022-11-29 23:24:15 0 [Note] Plugin 'FEEDBACK' is disabled.
docker-compose-mariadb-1        | 2022-11-29 23:24:15 0 [Note] InnoDB: Buffer pool(s) load completed at 221129 23:24:15
docker-compose-mariadb-1        | 2022-11-29 23:24:15 0 [Warning] You need to use --log-bin to make --expire-logs-days or --binlog-expire-logs-seconds work.
docker-compose-mariadb-1        | 2022-11-29 23:24:15 0 [Note] Server socket created on IP: '0.0.0.0'.
docker-compose-mariadb-1        | 2022-11-29 23:24:15 0 [Note] Server socket created on IP: '::'.
docker-compose-mariadb-1        | 2022-11-29 23:24:15 0 [Note] mariadbd: ready for connections.
docker-compose-mariadb-1        | Version: '10.6.11-MariaDB-1:10.6.11+maria~ubu2004'  socket: '/run/mysqld/mysqld.sock'  port: 3306  mariadb.org binary distribution
docker-compose-kibiter-1        | runtime: failed to create new OS thread (have 2 already; errno=22)
docker-compose-kibiter-1        | (same runtime.newosproc panic and stack trace as above)
docker-compose-kibiter-1 exited with code 2
docker-compose-elasticsearch-1  | OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
docker-compose-elasticsearch-1  | OpenJDK 64-Bit Server VM warning: UseAVX=2 is not supported on this CPU, setting it to UseAVX=0
docker-compose-hatstall-1       | No changes detected
docker-compose-kibiter-1        | runtime: failed to create new OS thread (have 2 already; errno=22)
docker-compose-kibiter-1        | (same runtime.newosproc panic and stack trace as above)
docker-compose-kibiter-1 exited with code 2
docker-compose-hatstall-1       | Operations to perform:
docker-compose-hatstall-1       |   Apply all migrations: admin, auth, contenttypes, sessions
docker-compose-hatstall-1       | Running migrations:
docker-compose-hatstall-1       |   Applying contenttypes.0001_initial... OK
docker-compose-hatstall-1       |   Applying auth.0001_initial... OK
docker-compose-hatstall-1       |   Applying admin.0001_initial... OK
docker-compose-hatstall-1       |   Applying admin.0002_logentry_remove_auto_add... OK
docker-compose-hatstall-1       |   Applying admin.0003_logentry_add_action_flag_choices... OK
docker-compose-hatstall-1       |   Applying contenttypes.0002_remove_content_type_name... OK
docker-compose-hatstall-1       |   Applying auth.0002_alter_permission_name_max_length... OK
docker-compose-hatstall-1       |   Applying auth.0003_alter_user_email_max_length... OK
docker-compose-hatstall-1       |   Applying auth.0004_alter_user_username_opts... OK
docker-compose-hatstall-1       |   Applying auth.0005_alter_user_last_login_null... OK
docker-compose-hatstall-1       |   Applying auth.0006_require_contenttypes_0002... OK
docker-compose-hatstall-1       |   Applying auth.0007_alter_validators_add_error_messages... OK
docker-compose-hatstall-1       |   Applying auth.0008_alter_user_username_max_length... OK
docker-compose-hatstall-1       |   Applying auth.0009_alter_user_last_name_max_length... OK
docker-compose-hatstall-1       |   Applying auth.0010_alter_group_name_max_length... OK
docker-compose-hatstall-1       |   Applying auth.0011_update_proxy_permissions... OK
docker-compose-hatstall-1       |   Applying auth.0012_alter_user_first_name_max_length... OK
docker-compose-hatstall-1       |   Applying sessions.0001_initial... OK
docker-compose-hatstall-1       |
docker-compose-hatstall-1       | 163 static files copied to '/home/grimoirelab/grimoirelab-hatstall/django-hatstall/static'.
docker-compose-elasticsearch-1  | [2022-11-29T23:24:25,238][WARN ][o.e.b.JNANatives         ] [unknown] unable to install syscall filter:
docker-compose-elasticsearch-1  | java.lang.UnsupportedOperationException: seccomp unavailable: CONFIG_SECCOMP not compiled into kernel, CONFIG_SECCOMP and CONFIG_SECCOMP_FILTER are needed
docker-compose-elasticsearch-1  | 	at org.elasticsearch.bootstrap.SystemCallFilter.linuxImpl(SystemCallFilter.java:342) ~[elasticsearch-6.8.6.jar:6.8.6]
docker-compose-elasticsearch-1  | 	at org.elasticsearch.bootstrap.SystemCallFilter.init(SystemCallFilter.java:617) ~[elasticsearch-6.8.6.jar:6.8.6]
docker-compose-elasticsearch-1  | 	at org.elasticsearch.bootstrap.JNANatives.tryInstallSystemCallFilter(JNANatives.java:260) [elasticsearch-6.8.6.jar:6.8.6]
docker-compose-elasticsearch-1  | 	at org.elasticsearch.bootstrap.Natives.tryInstallSystemCallFilter(Natives.java:113) [elasticsearch-6.8.6.jar:6.8.6]
docker-compose-elasticsearch-1  | 	at org.elasticsearch.bootstrap.Bootstrap.initializeNatives(Bootstrap.java:108) [elasticsearch-6.8.6.jar:6.8.6]
docker-compose-elasticsearch-1  | 	at org.elasticsearch.bootstrap.Bootstrap.setup(Bootstrap.java:170) [elasticsearch-6.8.6.jar:6.8.6]
docker-compose-elasticsearch-1  | 	at org.elasticsearch.bootstrap.Bootstrap.init(Bootstrap.java:333) [elasticsearch-6.8.6.jar:6.8.6]
docker-compose-elasticsearch-1  | 	at org.elasticsearch.bootstrap.Elasticsearch.init(Elasticsearch.java:159) [elasticsearch-6.8.6.jar:6.8.6]
docker-compose-elasticsearch-1  | 	at org.elasticsearch.bootstrap.Elasticsearch.execute(Elasticsearch.java:150) [elasticsearch-6.8.6.jar:6.8.6]
docker-compose-elasticsearch-1  | 	at org.elasticsearch.cli.EnvironmentAwareCommand.execute(EnvironmentAwareCommand.java:86) [elasticsearch-6.8.6.jar:6.8.6]
docker-compose-elasticsearch-1  | 	at org.elasticsearch.cli.Command.mainWithoutErrorHandling(Command.java:124) [elasticsearch-cli-6.8.6.jar:6.8.6]
docker-compose-elasticsearch-1  | 	at org.elasticsearch.cli.Command.main(Command.java:90) [elasticsearch-cli-6.8.6.jar:6.8.6]
docker-compose-elasticsearch-1  | 	at org.elasticsearch.bootstrap.Elasticsearch.main(Elasticsearch.java:116) [elasticsearch-6.8.6.jar:6.8.6]
docker-compose-elasticsearch-1  | 	at org.elasticsearch.bootstrap.Elasticsearch.main(Elasticsearch.java:93) [elasticsearch-6.8.6.jar:6.8.6]
docker-compose-elasticsearch-1  | [2022-11-29T23:24:26,474][INFO ][o.e.e.NodeEnvironment    ] [McrRPTG] using [1] data paths, mounts [[/ (overlay)]], net usable_space [22.6gb], net total_space [58.3gb], types [overlay]
docker-compose-elasticsearch-1  | [2022-11-29T23:24:26,477][INFO ][o.e.e.NodeEnvironment    ] [McrRPTG] heap size [1.9gb], compressed ordinary object pointers [true]
docker-compose-elasticsearch-1  | [2022-11-29T23:24:26,492][INFO ][o.e.n.Node               ] [McrRPTG] node name derived from node ID [McrRPTGtQ2qNNLBkJP3ypw]; set [node.name] to override
docker-compose-elasticsearch-1  | [2022-11-29T23:24:26,494][INFO ][o.e.n.Node               ] [McrRPTG] version[6.8.6], pid[1], build[oss/docker/3d9f765/2019-12-13T17:11:52.013738Z], OS[Linux/5.15.49-linuxkit/amd64], JVM[AdoptOpenJDK/OpenJDK 64-Bit Server VM/13.0.1/13.0.1+9]
docker-compose-elasticsearch-1  | [2022-11-29T23:24:26,494][INFO ][o.e.n.Node               ] [McrRPTG] JVM arguments [-Xms1g, -Xmx1g, -XX:+UseConcMarkSweepGC, -XX:CMSInitiatingOccupancyFraction=75, -XX:+UseCMSInitiatingOccupancyOnly, -Des.networkaddress.cache.ttl=60, -Des.networkaddress.cache.negative.ttl=10, -XX:+AlwaysPreTouch, -Xss1m, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djna.nosys=true, -XX:-OmitStackTraceInFastThrow, -Dio.netty.noUnsafe=true, -Dio.netty.noKeySetOptimization=true, -Dio.netty.recycler.maxCapacityPerThread=0, -Dlog4j.shutdownHookEnabled=false, -Dlog4j2.disable.jmx=true, -Djava.io.tmpdir=/tmp/elasticsearch-16412085569395981219, -XX:+HeapDumpOnOutOfMemoryError, -XX:HeapDumpPath=data, -XX:ErrorFile=logs/hs_err_pid%p.log, -Xlog:gc*,gc+age=trace,safepoint:file=logs/gc.log:utctime,pid,tags:filecount=32,filesize=64m, -Djava.locale.providers=COMPAT, -XX:UseAVX=2, -Xms2g, -Xmx2g, -Des.path.home=/usr/share/elasticsearch, -Des.path.conf=/usr/share/elasticsearch/config, -Des.distribution.flavor=oss, -Des.distribution.type=docker]
docker-compose-hatstall-1       | User for django admin created: admin/admin as login
docker-compose-hatstall-1       | AH00558: apache2: Could not reliably determine the server's fully qualified domain name, using 172.19.0.4. Set the 'ServerName' directive globally to suppress this message
docker-compose-hatstall-1       | AH00558: apache2: Could not reliably determine the server's fully qualified domain name, using 172.19.0.4. Set the 'ServerName' directive globally to suppress this message
docker-compose-hatstall-1       | [Tue Nov 29 23:24:28.008539 2022] [mpm_event:notice] [pid 205:tid 274907454720] AH00489: Apache/2.4.38 (Debian) OpenSSL/1.1.1d mod_wsgi/4.6.5 Python/3.7 configured -- resuming normal operations
docker-compose-hatstall-1       | [Tue Nov 29 23:24:28.013385 2022] [core:notice] [pid 205:tid 274907454720] AH00094: Command line: '/usr/sbin/apache2 -D FOREGROUND'
docker-compose-elasticsearch-1  | [2022-11-29T23:24:30,988][INFO ][o.e.p.PluginsService     ] [McrRPTG] loaded module [aggs-matrix-stats]
docker-compose-elasticsearch-1  | [2022-11-29T23:24:30,990][INFO ][o.e.p.PluginsService     ] [McrRPTG] loaded module [analysis-common]
docker-compose-elasticsearch-1  | [2022-11-29T23:24:30,990][INFO ][o.e.p.PluginsService     ] [McrRPTG] loaded module [ingest-common]
docker-compose-elasticsearch-1  | [2022-11-29T23:24:30,991][INFO ][o.e.p.PluginsService     ] [McrRPTG] loaded module [ingest-geoip]
docker-compose-elasticsearch-1  | [2022-11-29T23:24:30,991][INFO ][o.e.p.PluginsService     ] [McrRPTG] loaded module [ingest-user-agent]
docker-compose-elasticsearch-1  | [2022-11-29T23:24:30,991][INFO ][o.e.p.PluginsService     ] [McrRPTG] loaded module [lang-expression]
docker-compose-elasticsearch-1  | [2022-11-29T23:24:30,992][INFO ][o.e.p.PluginsService     ] [McrRPTG] loaded module [lang-mustache]
docker-compose-elasticsearch-1  | [2022-11-29T23:24:30,992][INFO ][o.e.p.PluginsService     ] [McrRPTG] loaded module [lang-painless]
docker-compose-elasticsearch-1  | [2022-11-29T23:24:30,993][INFO ][o.e.p.PluginsService     ] [McrRPTG] loaded module [mapper-extras]
docker-compose-elasticsearch-1  | [2022-11-29T23:24:30,993][INFO ][o.e.p.PluginsService     ] [McrRPTG] loaded module [parent-join]
docker-compose-elasticsearch-1  | [2022-11-29T23:24:30,993][INFO ][o.e.p.PluginsService     ] [McrRPTG] loaded module [percolator]
docker-compose-elasticsearch-1  | [2022-11-29T23:24:30,994][INFO ][o.e.p.PluginsService     ] [McrRPTG] loaded module [rank-eval]
docker-compose-elasticsearch-1  | [2022-11-29T23:24:30,994][INFO ][o.e.p.PluginsService     ] [McrRPTG] loaded module [reindex]
docker-compose-elasticsearch-1  | [2022-11-29T23:24:30,994][INFO ][o.e.p.PluginsService     ] [McrRPTG] loaded module [repository-url]
docker-compose-elasticsearch-1  | [2022-11-29T23:24:30,995][INFO ][o.e.p.PluginsService     ] [McrRPTG] loaded module [transport-netty4]
docker-compose-elasticsearch-1  | [2022-11-29T23:24:30,995][INFO ][o.e.p.PluginsService     ] [McrRPTG] loaded module [tribe]
docker-compose-elasticsearch-1  | [2022-11-29T23:24:30,998][INFO ][o.e.p.PluginsService     ] [McrRPTG] no plugins loaded

@sduenas
Member

sduenas commented Nov 30, 2022

I think Kibiter, which is a customized version of Kibana, can't run on a Mac laptop. I would recommend using OpenSearch instead, which has images built for your architecture. Could you try this other docker-compose?

@zhquan
Member

zhquan commented Nov 30, 2022

You should mount .ssh in mordred, not in kibiter

    mordred:
        volumes:
            - ~/.ssh/:/home/grimoire/.ssh

@loganknecht
Author

loganknecht commented Nov 30, 2022

@sduenas - The OpenSearch docker-compose solution does spin up

CONTAINER ID   IMAGE                                           COMMAND                  CREATED             STATUS                       PORTS                                                                NAMES
ca5db955ca34   grimoirelab/grimoirelab:latest                  "/bin/sh -c ${DEPLOY…"   About an hour ago   Up About an hour (healthy)                                                                        docker-compose-mordred-1
d0642aea295b   grimoirelab/hatstall:latest                     "/bin/sh -c ${DEPLOY…"   About an hour ago   Up About an hour             0.0.0.0:8000->80/tcp                                                 docker-compose-hatstall-1
222720534a24   mariadb:10.6                                    "docker-entrypoint.s…"   About an hour ago   Up About an hour             3306/tcp                                                             docker-compose-mariadb-1
3b8720e600c5   opensearchproject/opensearch:1.3.6              "./opensearch-docker…"   About an hour ago   Up About an hour             0.0.0.0:9200->9200/tcp, 9300/tcp, 0.0.0.0:9600->9600/tcp, 9650/tcp   opensearch-node1
66bcb1eac53b   opensearchproject/opensearch-dashboards:1.3.6   "./opensearch-dashbo…"   About an hour ago   Up About an hour             0.0.0.0:5601->5601/tcp                                               opensearch-dashboards

@zhquan - Thank you for the guidance. My new configuration is as follows

    mordred:
        restart: on-failure:5
        image: grimoirelab/grimoirelab:latest
        volumes:
            - ../default-grimoirelab-settings/setup-opensearch.cfg:/home/grimoire/conf/setup.cfg
            - ../default-grimoirelab-settings/projects.json:/home/grimoire/conf/projects.json
            - ../default-grimoirelab-settings/organizations.json:/home/grimoire/organizations.json
            - ../default-grimoirelab-settings/identities.yml:/home/grimoire/conf/identities.yml
            - /tmp/:/home/grimoire/logs
            - ~/.ssh/:/home/grimoire/.ssh
        depends_on:
            - mariadb
            - opensearch-node1
        mem_limit: 4g

However, it doesn't appear to have any dashboards, and it looks like the Gerrit information is not being pulled down into an index.

I can confirm that in the mordred grimoirelab container the .ssh directory has been correctly placed
image

Here is what my index management says:
image

And the dashboard view shows this as well
image

When I try to configure my index I get
image

I see that in the documentation here
https://github.com/chaoss/grimoirelab/tree/master/docker-compose#opensearch

It says

GrimoireLab works with OpenSearch, but panels are not automatically created, but they can be manually imported.

Is there any information or guidance on

  1. How do I confirm that my Gerrit configurations are being pulled in and indexed?
  2. What will the Gerrit index look like when ingested?
  3. How do I manually import the panels?
  4. Does OpenSearch have dashboard parity with Kibana?

@sduenas
Member

sduenas commented Dec 1, 2022

Is there any information or guidance on

1. How do I confirm that my gerrit configurations are being pulled in and indexed

You should check the logs generated by the grimoirelab image to see what's wrong. The platform should have created two indexes with the data but apparently it didn't.
If you used the docker-compose I mentioned in my previous comment, the logs should be in your /tmp/ directory. Check for a file named all.log and try to look inside it for gerrit entries or Traceback strings.

2. What will the Gerrit index look like when ingested?

According to your configuration on the setup.cfg file, there should be two indexes: gerrit_raw and gerrit_enriched

3. How do I manually import the panels?

I think @zhquan can help you with this.

4. Does open search have dashboard parity with kibana

The basic features and usage are practically the same. Since the fork they have been diverging more and more, but not by much so far.

@loganknecht
Author

loganknecht commented Dec 2, 2022

@sduenas Thank you very much for the answers. I am looking forward to hearing the guidance from @zhquan


The first issue I had was that I was using the wrong setup.cfg.
The correct file to use was default-grimoirelab-settings/setup-opensearch.cfg.

When I added the Gerrit configurations to that file it worked.


Regarding your suggestion to check the logs: it does appear to start up correctly, but a connection cannot be made.

My logs look like this

2022-12-02 00:58:48,791 - perceval.backends.core.gerrit - ERROR - gerrit cmd ssh  -p [PORT_REDACTED] lknecht@https://[HOSTNAME_REDACTED]/[PATH_REDACTED] gerrit  version  failed: Command 'ssh  -p [PORT_REDACTED] lknecht@https://[HOSTNAME_REDACTED]/[PATH_REDACTED] gerrit  version ' returned non-zero exit status 255.
2022-12-02 01:00:15,365 - sirmordred.task_projects - INFO - Reading projects data from  /home/grimoire/conf/projects.json 
2022-12-02 01:00:16,460 - sirmordred.task_identities - INFO - [sortinghat] No changes in file /home/grimoire/organizations.json, organizations won't be loaded
2022-12-02 01:00:16,488 - sirmordred.task_identities - INFO - Loading GrimoireLab identities in SortingHat
2022-12-02 01:00:16,867 - sirmordred.task_identities - INFO - [sortinghat] No changes in file /tmp/tmpnowrlghg, identities won't be loaded
2022-12-02 01:00:16,869 - sirmordred.task_identities - INFO - [sortinghat] End of loading identities from file /tmp/tmpnowrlghg
2022-12-02 01:00:19,603 - sirmordred.task_identities - INFO - [sortinghat] Unifying identities using algorithm email
2022-12-02 01:00:20,097 - sirmordred.task_identities - INFO - [sortinghat] Unifying identities using algorithm username
2022-12-02 01:00:20,510 - sirmordred.task_identities - INFO - [sortinghat] Executing affiliate
2022-12-02 01:00:20,880 - sirmordred.task_identities - INFO - [sortinghat] Executing autoprofile for sources: ['github', 'pipermail', 'git']
2022-12-02 01:00:21,232 - sirmordred.task_identities - INFO - [sortinghat] Autogender not configured. Skipping.
2022-12-02 01:00:21,237 - sirmordred.task_manager - INFO - [Global tasks] sleeping for 100 seconds 
2022-12-02 01:00:48,902 - grimoire_elk.elk - ERROR - Error feeding raw from gerrit (https://[HOSTNAME_REDACTED]/[PATH_REDACTED]): ssh  -p [PORT_REDACTED] lknecht@https://[HOSTNAME_REDACTED]/[PATH_REDACTED] gerrit  version  failed 3 times. Giving up!
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/grimoire_elk/elk.py", line 203, in feed_backend
    ocean_backend.feed(**params)
  File "/usr/local/lib/python3.8/site-packages/grimoire_elk/raw/elastic.py", line 234, in feed
    self.feed_items(items)
  File "/usr/local/lib/python3.8/site-packages/grimoire_elk/raw/elastic.py", line 250, in feed_items
    for item in items:
  File "/usr/local/lib/python3.8/site-packages/perceval/backend.py", line 316, in fetch
    for item in self.fetch_items(category, **kwargs):
  File "/usr/local/lib/python3.8/site-packages/perceval/backends/core/gerrit.py", line 122, in fetch_items
    if self.client.version[0] == 2 and self.client.version[1] == 8:
  File "/usr/local/lib/python3.8/site-packages/perceval/backends/core/gerrit.py", line 352, in version
    raw_data = self.__execute(cmd)
  File "/usr/local/lib/python3.8/site-packages/perceval/backends/core/gerrit.py", line 425, in __execute
    response = self.__execute_from_remote(cmd)
  File "/usr/local/lib/python3.8/site-packages/perceval/backends/core/gerrit.py", line 463, in __execute_from_remote
    raise result

This appears to be repeated for each repo specified in my projects.json file; the output looks like this for each one:

docker-compose-mordred-1   | ssh: Could not resolve hostname lknecht@https://[HOSTNAME_REDACTED]/[PATH_REDACTED] : Name or service not known
docker-compose-mordred-1   | 2022-12-02 01:22:39,864 - perceval.backends.core.gerrit - ERROR - gerrit cmd ssh  -p [PORT_REDACTED] lknecht@https://[REDACTED] gerrit  version  failed: Command 'ssh  -p [PORT_REDACTED] gerrit  version ' returned non-zero exit status 255.

docker-compose-mordred-1   | ssh: Could not resolve hostname [HOSTNAME_REDACTED]/[PATH_REDACTED] : Name or service not known
docker-compose-mordred-1   | 2022-12-02 01:41:51,180 - perceval.backends.core.gerrit - ERROR - gerrit cmd ssh  -p [PORT_REDACTED] lknecht@https://[HOSTNAME_REDACTED]/[PATH_REDACTED]  gerrit  version  failed: Command 'ssh  -p [PORT_REDACTED] lknecht@https://[HOSTNAME_REDACTED]/[PATH_REDACTED]  gerrit  version ' returned non-zero exit status 255.

I can't tell if I put the wrong URLs in the projects.json. It looks like this:

{
  "lknecht": {
    "meta": {
      "title": "Gerrit Test"
    },
    "gerrit": [
      // ...
      "https://[HOST_REDACTED]/[PATH_REDACTED]"
      // ...
    ]
  }
}

I tried changing the URL to ssh:// and that didn't work either. I also tried removing the protocol prefix, without luck.

However, I connected to the container directly and was able to clone it without issue.

grimoire@a1437ee81b8d:~$ cd /tmp/
grimoire@a1437ee81b8d:/tmp$ git clone "ssh://lknecht@[HOST_REDACTED]:[PORT_REDACTED]/[PATH_REDACTED]"
Cloning into '[REDACTED]'...
warning: You appear to have cloned an empty repository.
grimoire@a1437ee81b8d:/tmp$

I can confirm that my SSH key is correctly mounted on the volume as well.
image

Do you know why I'm receiving that message, but still able to clone the repo?
Am I doing something wrong with the Gerrit configuration?
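
As a side note for anyone hitting the same errors: the "Could not resolve hostname lknecht@https://..." messages suggest the full URL is being passed to ssh as the host. A minimal sketch (a hypothetical helper, not part of GrimoireLab) of reducing a repo URL to the bare hostname an ssh command expects:

```python
from urllib.parse import urlparse

def gerrit_host(url):
    """Reduce a Gerrit repo URL to the bare hostname (hypothetical helper)."""
    # urlparse only fills netloc when a scheme is present, so add one if missing
    if "://" not in url:
        url = "ssh://" + url
    netloc = urlparse(url).netloc
    # drop an optional user@ prefix and :port suffix
    return netloc.rsplit("@", 1)[-1].split(":", 1)[0]

print(gerrit_host("https://gerrit.example.info/some/project"))  # gerrit.example.info
```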

@sduenas
Member

sduenas commented Dec 2, 2022

Can you connect to the container and run the perceval command following this help?

@zhquan
Member

zhquan commented Dec 2, 2022

gerrit

First, try ssh -p 29418 lknecht@[HOSTNAME_REDACTED] on your mordred container to accept the host's authenticity; this is only needed the very first time.

Remember that for this https://review.opendev.org/ Gerrit instance:

  • the projects.json entry is review.opendev.org
  • in setup.cfg, add the user: user = USER

The complete example at https://github.com/chaoss/grimoirelab-sirmordred#gerrit-

If you want to make sure that your configuration is correct you can check with perceval as @sduenas said

  1. Enter to mordred instance
  2. ssh -p 29418 lknecht@[HOSTNAME_REDACTED] (if it is the first time)
  3. perceval gerrit HOSTNAME --user USER

The HOSTNAME should be in the projects.json and the USER in the setup.cfg

For the https://review.opendev.org example, the command should be

ssh -p 29418 [email protected]
perceval gerrit review.opendev.org --user USER

dashboards

To upload dashboards and index patterns to OpenDistro/OpenSearch chaoss/grimoirelab-kidash#40

  1. In setup.cfg, disable panels (panels = false) in the phases section.
  2. Use kidash to import the panels and index patterns stored in sigils, but you must use these repos (you only have to do this the first time):
    2.1. https://github.com/zhquan/grimoirelab-kidash/tree/opendistro
    2.2. https://github.com/zhquan/grimoirelab-sigils/tree/opendistro/json

e.g. for the git dashboard:

cd grimoirelab-sigils/json
kidash -e https://admin:admin@localhost:9200 --import git-index-pattern.json
kidash -e https://admin:admin@localhost:9200 --import git.json

# If your .kibana is not `.kibana` you have to add your admin/custom .kibana
kidash -e https://admin:admin@localhost:9200 --import git-index-pattern.json --kibana .kibana_xxx_admin_x
kidash -e https://admin:admin@localhost:9200 --import git.json --kibana .kibana_xxx_admin_x

To find out which one, run GET _cat/indices on OpenSearch and you will see all indices.
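
For reference, a small sketch (a hypothetical helper, assuming the default plain-text _cat/indices column layout where the index name is the third column) that picks the .kibana* index names out of that output:

```python
def kibana_indices(cat_output):
    """Return the .kibana* index names from plain-text `_cat/indices` output.

    Each line looks like: health status index uuid pri rep docs.count ...
    so the index name is the third whitespace-separated column.
    """
    names = []
    for line in cat_output.splitlines():
        cols = line.split()
        if len(cols) >= 3 and cols[2].startswith(".kibana"):
            names.append(cols[2])
    return names

sample = (
    "green open .kibana_92668751_admin_1 abc 1 0 42 0 10kb 10kb\n"
    "green open gerrit_raw def 1 1 100 0 1mb 1mb"
)
print(kibana_indices(sample))  # ['.kibana_92668751_admin_1']
```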

Or, if you have an old Kibiter instance, you can export the objects and import them (you may have to resize some visualizations):
Settings -> Management -> Saved Objects -> Export XXXX objects (at the top left)

@loganknecht
Author

loganknecht commented Dec 3, 2022

@zhquan Thank you so much for the response!

RE: Setup

It looks like I'm now seeing the indexes, and your suggestion about targeting the endpoint worked as well.
image

I also successfully SSHed in and am now running perceval, which looks like it's working. I believe that is what populates the indexes?

Open Search Indexes - Does Appear to Be Running

I made a mistake in my initial projects.json.
In it I had listed each repo individually, but for Gerrit this is what it took:

{
    "lknecht": {
        "meta": {
            "title": "Gerrit Test"
        },
        "gerrit": ["gerrit.example.info"]
    }
}

After that it appeared to be working.
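
Since the mistake was easy to make, here is a quick sketch (a hypothetical check, not part of GrimoireLab) that flags gerrit entries in projects.json which still look like URLs instead of bare hostnames:

```python
import json

def check_gerrit_entries(projects_json_text):
    """Return (project, entry) pairs whose gerrit entry is not a bare hostname."""
    projects = json.loads(projects_json_text)
    problems = []
    for name, sources in projects.items():
        for entry in sources.get("gerrit", []):
            # a bare hostname has no scheme and no path component
            if "://" in entry or "/" in entry:
                problems.append((name, entry))
    return problems

text = '{"lknecht": {"meta": {"title": "Gerrit Test"}, "gerrit": ["gerrit.example.info"]}}'
print(check_gerrit_entries(text))  # []
```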


At the moment, if I don't run perceval the size of the indices doesn't seem to increase. And even with perceval running right now, it doesn't look like the size is increasing.

Is there something that is supposed to run automatically instead of me connecting to the container and initiating perceval?
Am I misunderstanding how indices are populated?

Open Search Configurations - Missing Sections

I had several issues occur when trying to get the indices running.

First was that I got several errors stating the configuration required missing sections.

docker-compose-mordred-1   | 2022-12-02 23:18:30,620 - sirmordred.task_enrich - ERROR - Missing config for study enrich_demography_contribution:gerrit:
docker-compose-mordred-1   | 2022-12-02 23:18:30,624 - sirmordred.task_manager - ERROR - [gerrit] Exception in Task Manager Missing config for study enrich_demography_contribution:gerrit:

opensearch-node1           | [2022-12-02T23:32:22,448][ERROR][o.o.i.i.ManagedIndexCoordinator] [opensearch-node1] get managed-index failed: [.opendistro-ism-config] IndexNotFoundException[no such index [.opendistro-ism-config]]
opensearch-node1           | [2022-12-02T23:32:22,472][INFO ][o.o.c.m.MetadataCreateIndexService] [opensearch-node1] [gerrit-onion_enriched] creating index, cause [api], templates [], shards [1]/[1]

To fix this I had to uncomment the sections that were listed as optional in the setup-opensearch.cfg

[enrich_demography:gerrit]

[enrich_onion:gerrit]
in_index = gerrit_enriched
out_index = gerrit-onion_enriched

[enrich_demography_contribution:gerrit]
date_field = grimoire_creation_date
author_field = author_uuid

This was where I got the initial configurations
https://github.com/chaoss/grimoirelab/blob/master/default-grimoirelab-settings/setup-opensearch.cfg

Open Search Configurations - opendistro-ism-config Error

opensearch-node1           | [2022-12-03T00:36:12,388][ERROR][o.o.i.i.ManagedIndexCoordinator] [opensearch-node1] get managed-index failed: [.opendistro-ism-config] IndexNotFoundException[no such index [.opendistro-ism-config]]

I have no idea why this is happening. I tried adding a [.opendistro-ism-config] section to setup-opensearch.cfg, and that didn't work.

I have also searched across the code base and am not seeing that .opendistro-ism-config listed anywhere.

Do you know what I can do to resolve this?

Open Search Configurations - demo_sh Error Reading Communication Packets

docker-compose-mariadb-1   | 2022-12-03  0:02:04 10 [Warning] Aborted connection 10 to db: 'demo_sh' user: 'root' host: '172.19.0.6' (Got an error reading communication packets)
docker-compose-mariadb-1   | 2022-12-03  0:02:05 11 [Warning] Aborted connection 11 to db: 'demo_sh' user: 'root' host: '172.19.0.6' (Got an error reading communication packets)
docker-compose-mariadb-1   | 2022-12-03  0:02:05 12 [Warning] Aborted connection 12 to db: 'demo_sh' user: 'root' host: '172.19.0.6' (Got an error reading communication packets)
docker-compose-mariadb-1   | 2022-12-03  0:02:07 13 [Warning] Aborted connection 13 to db: 'demo_sh' user: 'root' host: '172.19.0.6' (Got an error reading communication packets)
docker-compose-mariadb-1   | 2022-12-03  0:02:07 14 [Warning] Aborted connection 14 to db: 'demo_sh' user: 'root' host: '172.19.0.6' (Got an error reading communication packets)
docker-compose-mariadb-1   | 2022-12-03  0:02:08 15 [Warning] Aborted connection 15 to db: 'demo_sh' user: 'root' host: '172.19.0.6' (Got an error reading communication packets)
docker-compose-mariadb-1   | 2022-12-03  0:02:08 16 [Warning] Aborted connection 16 to db: 'demo_sh' user: 'root' host: '172.19.0.6' (Got an error reading communication packets)
docker-compose-mariadb-1   | 2022-12-02 23:34:24 42 [Warning] Aborted connection 42 to db: 'unconnected' user: 'root' host: '172.19.0.6' (Got an error reading communication packets)
docker-compose-mariadb-1   | 2022-12-02 23:34:24 51 [Warning] Aborted connection 51 to db: 'demo_sh' user: 'root' host: '172.19.0.6' (Got an error reading communication packets)

I have no idea what this means.

The sortinghat config is where this appears to be used

[sortinghat]
host = mariadb
user = root
password = password
database = demo_sh
load_orgs = true
orgs_file = /home/grimoire/organizations.json
autoprofile = [github, pipermail, git]
matching = [email,username]
sleep_for = 100
unaffiliated_group = Unknown
affiliate = true
strict_mapping = false
reset_on_load = false
identities_file = [/home/grimoire/conf/identities.yml]
identities_format = grimoirelab

The mariadb service in the docker-compose-opensearch.yaml is configured like so

    mariadb:
        image: mariadb:10.6
        expose:
            - "3306"
        environment:
            - MYSQL_ROOT_PASSWORD=password
            - MYSQL_ALLOW_EMPTY_PASSWORD=yes

I'm not sure if this is an authentication issue or something else.

Do you know why this is happening and/or what to fix to resolve this?

Dashboards

I'm getting a Failed to establish a new connection: [Errno 111] Connection refused')': /.kibana error when I try to use kidash to import the dashboard.

I ran the following steps

cd /tmp

pip3 install kidash

git clone https://github.com/zhquan/grimoirelab-sigils.git
cd grimoirelab-sigils/json

grimoire@c726aec5b329:/tmp/grimoirelab-sigils/json$ kidash --elastic_url https://admin:admin@localhost:9200 --import git.json
2022-12-03 00:12:00,334 Retrying (Retry(total=20, connect=20, read=8, redirect=5, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPSConnection object at 0xffffa45ead60>: Failed to establish a new connection: [Errno 111] Connection refused')': /.kibana
2022-12-03 00:12:00,735 Retrying (Retry(total=19, connect=19, read=8, redirect=5, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPSConnection object at 0xffffa53f8700>: Failed to establish a new connection: [Errno 111] Connection refused')': /.kibana
2022-12-03 00:12:01,536 Retrying (Retry(total=18, connect=18, read=8, redirect=5, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPSConnection object at 0xffffa4645c70>: Failed to establish a new connection: [Errno 111] Connection refused')': /.kibana
2022-12-03 00:12:03,141 Retrying (Retry(total=17, connect=17, read=8, redirect=5, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPSConnection object at 0xffffa4645a00>: Failed to establish a new connection: [Errno 111] Connection refused')': /.kibana

I see that this is configured in the docker-compose-opensearch.yml's opensearch-dashboards service

    opensearch-dashboards:
        image: opensearchproject/opensearch-dashboards:1.3.6
        container_name: opensearch-dashboards
        ports:
            - 5601:5601
        expose:
            - "5601"
        environment:
            OPENSEARCH_HOSTS: '["https://opensearch-node1:9200"]'

So I changed the command to target the specified host; it connects now, but the import fails.

grimoire@ccce27d802bc:/tmp/grimoirelab-sigils/json$ kidash --elastic_url https://admin:admin@opensearch-node1:9200 --import git.json
2022-12-03 00:50:51,317 400 Client Error: Bad Request for url: https://admin:admin@opensearch-node1:9200/.kibana/dashboard/Git. Content: b'{"error":{"root_cause":[{"type":"illegal_argument_exception","reason":"Rejecting mapping update to [.kibana_1] as the final mapping would have more than 1 type: [_doc, dashboard]"}],"type":"illegal_argument_exception","reason":"Rejecting mapping update to [.kibana_1] as the final mapping would have more than 1 type: [_doc, dashboard]"},"status":400}'

Is there something I should be doing differently here?

Gratitude

Thank you so much to everyone for helping me through these hiccups!
I greatly appreciate all your guidance.

@zhquan
Member

zhquan commented Dec 5, 2022

Gerrit indices

There are no documents in your gerrit indices, so the collection phase is not working correctly. Check your mordred logs (all.log) and search for gerrit.

Your setup should look like this:

[gerrit]
raw_index = gerrit_raw
enriched_index = gerrit_enriched
user = lknecht
no-archive = true

Dashboards

You must use these versions to upload dashboards and index patterns to an OpenSearch instance.

Install kidash

cd /tmp
git clone https://github.com/zhquan/grimoirelab-kidash
cd grimoirelab-kidash
git checkout opendistro
python3 setup.py install

Clone sigils

cd /tmp
git clone https://github.com/zhquan/grimoirelab-sigils
cd grimoirelab-sigils
git checkout opendistro

Run kidash

cd /tmp/grimoirelab-sigils/json
kidash -e https://admin:admin@localhost:9200 --import gerrit-index-pattern.json
kidash -e https://admin:admin@localhost:9200 --import gerrit.json

# If your .kibana is not `.kibana` you have to add your admin/custom .kibana
kidash -e https://admin:admin@localhost:9200 --import gerrit-index-pattern.json --kibana .kibana_xxx_admin_x
kidash -e https://admin:admin@localhost:9200 --import gerrit.json --kibana .kibana_xxx_admin_x

@loganknecht
Author

loganknecht commented Dec 5, 2022

Hey @zhquan

Gerrit Ingestion

I ended up getting my Gerrit ingestion working by using the suggestion you provided with this configuration in my default-grimoirelab-settings/setup-opensearch.cfg

# ------------------------------------------------------------------------------
# Gerrit
# ------------------------------------------------------------------------------
[gerrit]
raw_index = gerrit_raw
enriched_index = gerrit_enriched
user = lknecht
no-archive = true
# blacklist-ids = []
# max-reviews = 500
# studies = [enrich_demography:gerrit, enrich_onion:gerrit, enrich_demography_contribution:gerrit]

[enrich_demography:gerrit]

[enrich_onion:gerrit]
in_index = gerrit_enriched
out_index = gerrit-onion_enriched

[enrich_demography_contribution:gerrit]
date_field = grimoire_creation_date
author_field = author_uuid
# ------------------------------------------------------------------------------

image

My /tmp/all.log looks like this now

2022-12-05 22:02:14,936 - sirmordred.sirmordred - INFO - 
2022-12-05 22:02:14,938 - sirmordred.sirmordred - INFO - ----------------------------
2022-12-05 22:02:14,941 - sirmordred.sirmordred - INFO - Starting SirMordred engine ...
2022-12-05 22:02:14,942 - sirmordred.sirmordred - INFO - - - - - - - - - - - - - - - 
2022-12-05 22:02:14,951 - urllib3.connectionpool - WARNING - Retrying (Retry(total=20, connect=11, read=8, redirect=5, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPSConnection object at 0xffff7e262f70>: Failed to establish a new connection: [Errno 111] Connection refused')': /
2022-12-05 22:02:15,360 - urllib3.connectionpool - WARNING - Retrying (Retry(total=19, connect=10, read=8, redirect=5, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPSConnection object at 0xffff7e262df0>: Failed to establish a new connection: [Errno 111] Connection refused')': /
2022-12-05 22:02:16,165 - urllib3.connectionpool - WARNING - Retrying (Retry(total=18, connect=9, read=8, redirect=5, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPSConnection object at 0xffff7e262c70>: Failed to establish a new connection: [Errno 111] Connection refused')': /
2022-12-05 22:02:17,773 - urllib3.connectionpool - WARNING - Retrying (Retry(total=17, connect=8, read=8, redirect=5, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPSConnection object at 0xffff7e262730>: Failed to establish a new connection: [Errno 111] Connection refused')': /
2022-12-05 22:02:21,002 - urllib3.connectionpool - WARNING - Retrying (Retry(total=16, connect=7, read=8, redirect=5, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPSConnection object at 0xffff7e2624f0>: Failed to establish a new connection: [Errno 111] Connection refused')': /
2022-12-05 22:02:27,417 - urllib3.connectionpool - WARNING - Retrying (Retry(total=15, connect=6, read=8, redirect=5, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPSConnection object at 0xffff7e2622b0>: Failed to establish a new connection: [Errno 111] Connection refused')': /
2022-12-05 22:02:40,234 - urllib3.connectionpool - WARNING - Retrying (Retry(total=14, connect=5, read=8, redirect=5, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPSConnection object at 0xffff7e254fd0>: Failed to establish a new connection: [Errno 111] Connection refused')': /
2022-12-05 22:02:41,709 - sirmordred.sirmordred - INFO - Loading projects
2022-12-05 22:02:42,713 - sirmordred.task_projects - INFO - Reading projects data from  /home/grimoire/conf/projects.json 
2022-12-05 22:02:42,731 - sirmordred.sirmordred - INFO - Projects loaded
2022-12-05 22:02:42,735 - sirmordred.sirmordred - INFO - TaskProjects TaskIdentitiesLoad TaskIdentitiesMerge TaskIdentitiesExport will be executed on Mon, 05 Dec 2022 22:04:22 
2022-12-05 22:02:43,751 - sirmordred.task_collection - INFO - [gerrit] collection phase starts
2022-12-05 22:02:43,752 - sirmordred.task_projects - INFO - Reading projects data from  /home/grimoire/conf/projects.json 
2022-12-05 22:02:43,754 - sirmordred.task_collection - INFO - [gerrit] collection starts for [REDACTED]
2022-12-05 22:02:43,991 - grimoire_elk.elastic - INFO - Created index https://opensearch-node1:9200/gerrit_raw
2022-12-05 22:02:44,195 - grimoire_elk.elastic - INFO - Alias {'alias': 'gerrit-raw', 'index': 'gerrit_raw'} created on https://opensearch-node1:9200/gerrit_raw.
2022-12-05 22:02:44,281 - grimoire_elk.raw.elastic - INFO - [gerrit] Incremental from: None until None for [REDACTED]
2022-12-05 22:02:44,837 - sirmordred.task_identities - INFO - [sortinghat] Loading orgs from file /home/grimoire/organizations.json
2022-12-05 22:02:44,850 - sortinghat.command - INFO - Database demo_sh:mariadb None set
2022-12-05 22:02:53,551 - sirmordred.task_identities - INFO - [sortinghat] 1632 organizations loaded
2022-12-05 22:02:53,553 - sirmordred.task_identities - INFO - Loading GrimoireLab identities in SortingHat
2022-12-05 22:02:53,737 - sirmordred.task_identities - INFO - [sortinghat] Loading identities from file /tmp/tmp5l6befky
2022-12-05 22:02:53,752 - sortinghat.command - INFO - Database demo_sh:mariadb None set
2022-12-05 22:02:53,888 - sirmordred.task_identities - INFO - [sortinghat] End of loading identities from file /tmp/tmp5l6befky
2022-12-05 22:02:56,253 - sirmordred.task_identities - INFO - [sortinghat] Unifying identities using algorithm email
2022-12-05 22:02:56,746 - sirmordred.task_identities - INFO - [sortinghat] Unifying identities using algorithm username
2022-12-05 22:02:57,240 - sirmordred.task_identities - INFO - [sortinghat] Executing affiliate
2022-12-05 22:02:57,622 - sirmordred.task_identities - INFO - [sortinghat] Executing autoprofile for sources: ['github', 'pipermail', 'git']
2022-12-05 22:02:58,027 - sirmordred.task_identities - INFO - [sortinghat] Autogender not configured. Skipping.
2022-12-05 22:02:58,029 - sirmordred.task_manager - INFO - [Global tasks] sleeping for 100 seconds 
2022-12-05 22:04:39,141 - sirmordred.task_projects - INFO - Reading projects data from  /home/grimoire/conf/projects.json 
2022-12-05 22:04:40,212 - sirmordred.task_identities - INFO - [sortinghat] No changes in file /home/grimoire/organizations.json, organizations won't be loaded
2022-12-05 22:04:40,292 - sirmordred.task_identities - INFO - Loading GrimoireLab identities in SortingHat
2022-12-05 22:04:40,541 - sirmordred.task_identities - INFO - [sortinghat] No changes in file /tmp/tmp5ag3_0pq, identities won't be loaded
2022-12-05 22:04:40,543 - sirmordred.task_identities - INFO - [sortinghat] End of loading identities from file /tmp/tmp5ag3_0pq
2022-12-05 22:04:42,995 - sirmordred.task_identities - INFO - [sortinghat] Unifying identities using algorithm email
2022-12-05 22:04:43,437 - sirmordred.task_identities - INFO - [sortinghat] Unifying identities using algorithm username
2022-12-05 22:04:43,910 - sirmordred.task_identities - INFO - [sortinghat] Executing affiliate
2022-12-05 22:04:44,298 - sirmordred.task_identities - INFO - [sortinghat] Executing autoprofile for sources: ['github', 'pipermail', 'git']
2022-12-05 22:04:44,605 - sirmordred.task_identities - INFO - [sortinghat] Autogender not configured. Skipping.
2022-12-05 22:04:44,607 - sirmordred.task_manager - INFO - [Global tasks] sleeping for 100 seconds 

However, I am still seeing this in the Docker Compose output, though it doesn't seem to affect the ingestion negatively.

docker-compose-mariadb-1   | 2022-12-05 22:27:43 122 [Warning] Aborted connection 122 to db: 'demo_sh' user: 'root' host: '172.19.0.6' (Got an error reading communication packets)
docker-compose-mariadb-1   | 2022-12-05 22:27:43 123 [Warning] Aborted connection 123 to db: 'demo_sh' user: 'root' host: '172.19.0.6' (Got an error reading communication packets)
docker-compose-mariadb-1   | 2022-12-05 22:27:44 124 [Warning] Aborted connection 124 to db: 'demo_sh' user: 'root' host: '172.19.0.6' (Got an error reading communication packets)
docker-compose-mariadb-1   | 2022-12-05 22:27:45 125 [Warning] Aborted connection 125 to db: 'demo_sh' user: 'root' host: '172.19.0.6' (Got an error reading communication packets)
docker-compose-mariadb-1   | 2022-12-05 22:27:46 126 [Warning] Aborted connection 126 to db: 'demo_sh' user: 'root' host: '172.19.0.6' (Got an error reading communication packets)
docker-compose-mariadb-1   | 2022-12-05 22:27:46 127 [Warning] Aborted connection 127 to db: 'demo_sh' user: 'root' host: '172.19.0.6' (Got an error reading communication packets)
docker-compose-mariadb-1   | 2022-12-05 22:27:46 128 [Warning] Aborted connection 128 to db: 'demo_sh' user: 'root' host: '172.19.0.6' (Got an error reading communication packets)


Dashboard Migration

I followed the steps you listed above, but I'm still getting the same errors:

cd /tmp
git clone https://github.com/zhquan/grimoirelab-kidash

cd /tmp/grimoirelab-kidash
git checkout opendistro
sudo python3 setup.py install

cd /tmp
git clone https://github.com/zhquan/grimoirelab-sigils
cd grimoirelab-sigils
git checkout opendistro
cd /tmp/grimoirelab-sigils/json

kidash -e http://admin:admin@localhost:9200 --import gerrit-index-pattern.json

grimoire@dcb689f34556:/tmp/grimoirelab-sigils/json$ kidash -e http://admin:admin@localhost:9200 --import gerrit-index-pattern.json
2022-12-05 22:35:56,820 Retrying (Retry(total=20, connect=20, read=8, redirect=5, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0xffffb0dc1bb0>: Failed to establish a new connection: [Errno 111] Connection refused')': /.kibana
2022-12-05 22:35:57,221 Retrying (Retry(total=19, connect=19, read=8, redirect=5, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0xffffb0c2cdf0>: Failed to establish a new connection: [Errno 111] Connection refused')': /.kibana
2022-12-05 22:35:58,023 Retrying (Retry(total=18, connect=18, read=8, redirect=5, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0xffffb0c2cac0>: Failed to establish a new connection: [Errno 111] Connection refused')': /.kibana
2022-12-05 22:35:59,626 Retrying (Retry(total=17, connect=17, read=8, redirect=5, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0xffffb0c2c6d0>: Failed to establish a new connection: [Errno 111] Connection refused')': /.kibana
...

This leads me to believe that kidash is not reaching the correct URL.

What endpoint should I be targeting?

Do you have any suggestions for what to do to get the dashboards migrated?

@loganknecht
Author

@zhquan I have tried again with no luck. Still the same issue:

grimoire@4c010d1aae24:/tmp/grimoirelab-sigils/json$ kidash -e http://admin:admin@localhost:9200 --import gerrit-index-pattern.json
2022-12-06 20:18:29,900 Retrying (Retry(total=20, connect=20, read=8, redirect=5, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0xffff8e232b50>: Failed to establish a new connection: [Errno 111] Connection refused')': /.kibana
2022-12-06 20:18:30,302 Retrying (Retry(total=19, connect=19, read=8, redirect=5, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0xffff8e2fdee0>: Failed to establish a new connection: [Errno 111] Connection refused')': /.kibana
2022-12-06 20:18:31,102 Retrying (Retry(total=18, connect=18, read=8, redirect=5, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0xffff8e2fdbb0>: Failed to establish a new connection: [Errno 111] Connection refused')': /.kibana
2022-12-06 20:18:32,704 Retrying (Retry(total=17, connect=17, read=8, redirect=5, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0xffff8e2fd610>: Failed to establish a new connection: [Errno 111] Connection refused')': /.kibana

@zhquan
Member

zhquan commented Dec 7, 2022

Gerrit

2022-12-05 22:02:43,751 - sirmordred.task_collection - INFO - [gerrit] collection phase starts

Now you have 3000 documents in your gerrit_raw index. You have to wait and let Mordred finish the collection phase; then it will start the enrichment phase.

Dashboard

I updated the opendistro branch; could you try again?

Clone kidash again, or update your existing opendistro branch, and reinstall it.

kidash -e https://admin:admin@localhost:9200 --import gerrit-index-pattern.json
2022-12-07 12:36:33,638 Index pattern gerrit from ['gerrit'] imported

kidash -e https://admin:admin@localhost:9200 --import gerrit.json
2022-12-07 12:36:41,637 Dashboard Gerrit imported

This is my OpenSearch

curl https://admin:admin@localhost:9200 -k
{
  "name" : "opensearch",
  "cluster_name" : "docker-cluster",
  "cluster_uuid" : "6qYHX2MdRm6-z_4pAUGI-A",
  "version" : {
    "distribution" : "opensearch",
    "number" : "1.3.2",
    "build_type" : "tar",
    "build_hash" : "6febcf7b53ff189de767e460e905e9e5aeecc8cb",
    "build_date" : "2022-05-04T03:58:48.328641Z",
    "build_snapshot" : false,
    "lucene_version" : "8.10.1",
    "minimum_wire_compatibility_version" : "6.8.0",
    "minimum_index_compatibility_version" : "6.0.0-beta1"
  },
  "tagline" : "The OpenSearch Project: https://opensearch.org/"
}

@loganknecht
Author

Gratitude

@sduenas @zhquan and all - Thank you so much for all your help and patience. I have managed to get this working using the OpenSearch Dashboards configuration.

For everyone else: if you're using a MacBook with an M1 chip, or having issues with the default approach and want to try OpenSearch Dashboards, here are the steps to do so.


Steps to Use OpenSearch Dashboards

  1. Clone the repo
    • git clone https://github.com/chaoss/grimoirelab.git
  2. Configure your default-grimoirelab-settings/projects.json file
    {
        "PROJECT_NAME_CHANGE_IT_TO_WHAT_YOU_WANT": {
            "meta": {
                "title": "Gerrit"
            },
            "gerrit": ["GERRIT_HOSTNAME.com"]
        }
    }
  3. Add your Gerrit configurations to the bottom of your default-grimoirelab-settings/setup-opensearch.cfg file
    # ------------------------------------------------------------------------------
    # Gerrit
    # ------------------------------------------------------------------------------
    [gerrit]
    raw_index = gerrit_raw
    enriched_index = gerrit_enriched
    user = YOUR_GERRIT_USERNAME_CHANGE_THIS
    no-archive = true
    
    [enrich_demography:gerrit]
    
    [enrich_onion:gerrit]
    in_index = gerrit_enriched
    out_index = gerrit-onion_enriched
    
    [enrich_demography_contribution:gerrit]
    date_field = grimoire_creation_date
    author_field = author_uuid
    # ------------------------------------------------------------------------------
  4. Add Gerrit to your [sortinghat]'s autoprofile in the default-grimoirelab-settings/setup-opensearch.cfg file
    [sortinghat]
    host = mariadb
    user = root
    password =
    database = demo_sh
    load_orgs = true
    orgs_file = /home/grimoire/organizations.json
    autoprofile = [gerrit, github, pipermail, git] # THIS IS THE ONLY CHANGE MADE
    matching = [email,username]
    sleep_for = 100
    unaffiliated_group = Unknown
    affiliate = true
    strict_mapping = false
    reset_on_load = false
    identities_file = [/home/grimoire/conf/identities.yml]
    identities_format = grimoirelab
    
  5. Log in to Gerrit and configure your SSH key

Quick side tangent: you're going to want to make sure your ~/.ssh/config file is correctly formatted.

When I originally tried this I encountered an error because the ~/.ssh/config file looked like this

Host *
  AddKeysToAgent yes
  UseKeychain yes
  IdentityFile ~/.ssh/id_rsa

The UseKeychain option is macOS-specific and isn't recognized by the OpenSSH client inside the Linux container, so you have to remove it.
Your ~/.ssh/config should look something like this:

Host *
  AddKeysToAgent yes
  IdentityFile ~/.ssh/id_rsa
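A quick way to confirm the cleaned-up config will be accepted by the Linux OpenSSH client inside the container is `ssh -G`, which parses a config and resolves the effective options without opening any connection; a leftover macOS-only option such as UseKeychain makes it exit with an error. A sketch against a throwaway copy (the hostname is just a placeholder):

```shell
# Parse-check a copy of the portable config; no network access happens.
cat > /tmp/ssh-config-check <<'EOF'
Host *
  AddKeysToAgent yes
  IdentityFile ~/.ssh/id_rsa
EOF
ssh -G -F /tmp/ssh-config-check gerrit.example.com > /dev/null && echo "ssh config OK"
```

On a stock Linux client, adding UseKeychain back to the temp file makes the same command fail with a Bad configuration option error (unless IgnoreUnknown is set).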

  6. Configure the docker-compose/docker-compose-opensearch.yml Mordred service to mount your .ssh folder so that it can authenticate with Gerrit
    mordred:
      restart: on-failure:5
      image: grimoirelab/grimoirelab:latest
      volumes:
        - ~/.ssh/:/home/grimoire/.ssh # THIS IS THE LINE YOU ADD
        - ../default-grimoirelab-settings/setup-opensearch.cfg:/home/grimoire/conf/setup.cfg
        - ../default-grimoirelab-settings/projects.json:/home/grimoire/conf/projects.json
        - ../default-grimoirelab-settings/organizations.json:/home/grimoire/organizations.json
        - ../default-grimoirelab-settings/identities.yml:/home/grimoire/conf/identities.yml
        - /tmp/:/home/grimoire/logs
      depends_on:
        - mariadb
        - opensearch-node1
      mem_limit: 4g
  7. Run the OpenSearch Docker Compose file
    • cd docker-compose/
    • docker-compose --file docker-compose-opensearch.yml up
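Before bringing the stack up (or whenever you edit them), a quick syntax check of the two configuration files can catch typos that would otherwise only surface in Mordred's logs. A minimal sketch; the sample files created under /tmp use hypothetical values so the commands are runnable as-is:

```shell
# Sample stand-ins for the two files edited above (hypothetical values).
cat > /tmp/projects-sample.json <<'EOF'
{
    "example-project": {
        "meta": { "title": "Gerrit" },
        "gerrit": ["gerrit.example.com"]
    }
}
EOF

cat > /tmp/setup-sample.cfg <<'EOF'
[gerrit]
raw_index = gerrit_raw
enriched_index = gerrit_enriched
user = YOUR_GERRIT_USERNAME
no-archive = true
EOF

# json.tool fails loudly on a JSON syntax error; configparser does the
# same for the cfg file.
python3 -m json.tool /tmp/projects-sample.json > /dev/null && echo "projects.json OK"
python3 -c "
import configparser
cfg = configparser.ConfigParser()
cfg.read('/tmp/setup-sample.cfg')
assert cfg.has_section('gerrit'), 'missing [gerrit] section'
print('setup.cfg OK')
"
```

In a real checkout, point the same two commands at default-grimoirelab-settings/projects.json and default-grimoirelab-settings/setup-opensearch.cfg instead of the samples.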

At this point it's running in an attached state, so you should see the output from Docker Compose in your shell.


Another quick side tangent.

In the file docker-compose/docker-compose-opensearch.yml there is the Mordred service with the volume /tmp/:/home/grimoire/logs mounted.

What this does is mount your machine's /tmp directory into the container at /home/grimoire/logs, so anything the services write there shows up in /tmp on your host.

So if you're having issues you can open the file /tmp/all.log on your local machine and see the logs that these services are creating.
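For instance, to pull just the Gerrit activity out of that file (the sample log below stands in for /tmp/all.log so the command is runnable anywhere; its lines mirror the Mordred output quoted earlier):

```shell
# On a real deployment, grep /tmp/all.log on the host instead of the sample.
cat > /tmp/all-sample.log <<'EOF'
2022-12-05 22:02:43,751 - sirmordred.task_collection - INFO - [gerrit] collection phase starts
2022-12-05 22:04:43,910 - sirmordred.task_identities - INFO - [sortinghat] Executing affiliate
EOF
grep -i gerrit /tmp/all-sample.log
```

Only the [gerrit] collection line comes back, which makes it easy to confirm the backend actually started.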


The next steps are to get the dashboards migrated to OpenSearch

  1. In a separate terminal session, find the Docker container for grimoirelab
    • docker ps -a | grep grimoirelab
    • Copy the container ID from the grimoirelab/grimoirelab:latest row

It should look something like this

~/Repositories/grimoirelab   master ±⬡ v16.15.1 docker ps -a | grep grimoirelab
48016b559f11   grimoirelab/grimoirelab:latest                  "/bin/sh -c ${DEPLOY…"   2 days ago     Up 24 hours (healthy)                                                                        docker-compose-mordred-1
d842468f6e28   grimoirelab/hatstall:latest                     "/bin/sh -c ${DEPLOY…"   2 days ago     Up 24 hours             0.0.0.0:8000->80/tcp                                                 docker-compose-hatstall-1
  2. Connect to the container
    • docker exec -it 48016b559f11 /bin/bash

Now you're going to want to follow these steps to get the dashboards migrated and installed

# Install the Kidash tool - needs to be installed as it's a custom branch
cd /tmp
git clone https://github.com/zhquan/grimoirelab-kidash

cd /tmp/grimoirelab-kidash
git checkout opendistro
sudo python3 setup.py install

# Install sigils - this is where the dashboard definitions live
cd /tmp
git clone https://github.com/zhquan/grimoirelab-sigils
cd grimoirelab-sigils
git checkout opendistro
cd /tmp/grimoirelab-sigils/json

# Import Dashboards
kidash -e https://admin:admin@opensearch-node1:9200 --import gerrit_approvals.json
kidash -e https://admin:admin@opensearch-node1:9200 --import gerrit_backlog.json
kidash -e https://admin:admin@opensearch-node1:9200 --import gerrit_efficiency.json
kidash -e https://admin:admin@opensearch-node1:9200 --import gerrit-index-pattern.json
kidash -e https://admin:admin@opensearch-node1:9200 --import gerrit.json
kidash -e https://admin:admin@opensearch-node1:9200 --import gerrit_retention_newcomers.json
kidash -e https://admin:admin@opensearch-node1:9200 --import gerrit_timing.json

@zhquan
Member

zhquan commented Dec 13, 2022

@loganknecht very nice summary, thank you very much.

You don't need to install kidash inside Mordred (grimoirelab); you can install it locally. That way you don't need to reinstall kidash every time you recreate the Mordred container.

If kidash is installed locally, remember to change the OpenSearch endpoint to https://admin:admin@localhost:9200; the opensearch-node1 endpoint is only reachable from inside the grimoirelab containers.

I'm closing the issue; feel free to reopen it if needed.

@zhquan zhquan closed this as completed Dec 13, 2022