
memory leaks #1039

Closed
majid1605 opened this issue Feb 13, 2023 · 9 comments
@majid1605

majid1605 commented Feb 13, 2023

Describe the bug
Hello,
I just got acquainted with your database. It is a very fast database.
Unfortunately, I couldn't find any good information about authentication in the documentation.
But the main problem is memory usage:
1. I inserted 3,488,195 records into the database using 4 SQL files, and after doing this memory usage reached nearly 1 GB. I had to restart the container to free it, and after I did this memory usage was 160 MB.
2. After restarting, and without running a query or otherwise using the database, memory usage reached 490 MB after about 6 hours.
I have been testing this for the past two days and the memory keeps increasing.
What is the reason for this?

Describe the environment:

  • docker image manticoresearch/manticore:6.0.2
  • OS: Arch Linux

Additional context
sample data :

   
INSERT INTO trades_sample(ts,symbol,open,high,low,close,volume) VALUES
 ('2020-08-13 04:29:59 04:30','LTC',54.45000076293945,54.54999923706055,54.41999816894531,54.52000045776367,893),
 ('2020-08-13 04:29:59 04:30','SOL',3.7558000087738037,3.7558000087738037,3.7558000087738037,3.7558000087738037,45),
 ('2020-08-20 04:29:00 04:30','BNB',22.32900047302246,22.38789939880371,22.31399917602539,22.376100540161133,1690)
...........
@tomatolog
Contributor

After posting your data you could issue

FLUSH RAMCHUNK index-name

as described in the documentation.

That creates a disk chunk from the RAM data collected so far and frees some memory.
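For example, a sketch using the trades_sample table from the report above (run via the SQL interface, e.g. the MySQL protocol on port 9306; this needs a running Manticore instance):

```sql
-- Write the RT table's in-memory RAM chunk out as a new disk chunk,
-- freeing the memory it occupied.
FLUSH RAMCHUNK trades_sample;
```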

@sanikolaev
Collaborator

Unfortunately, I couldn't find any good information about authentication in the documentation.

Manticore doesn't yet provide HTTP or MySQL auth out of the box, but you can easily put something lightweight in front of it for that. For example, here's how it can be done with docker-compose:

version: '2.2'
services:
  manticore:
    container_name: manticore
    image: manticoresearch/manticore:latest
    volumes:
      - ./data:/var/lib/manticore
    restart: always
    ulimits:
      nproc: 65535
      nofile:
        soft: 65535
        hard: 65535
      memlock:
        soft: -1
        hard: -1
  auth:
    container_name: auth
    image: quay.io/dtan4/nginx-basic-auth-proxy:latest
    environment:
      - BASIC_AUTH_USERNAME=username
      - BASIC_AUTH_PASSWORD=changeme
      - PROXY_PASS=http://manticore:9308
      - SERVER_NAME=localhost
      - PORT=80
    ports:
      - "127.0.0.1:80:80"
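Once both containers are up, clients authenticate against the proxy instead of hitting Manticore directly. A sketch, using the placeholder credentials from the compose file above and Manticore's HTTP JSON /search endpoint (the table name is only illustrative, and the services must be running):

```bash
# Query Manticore's HTTP API through the nginx basic-auth proxy.
curl -u username:changeme http://127.0.0.1:80/search \
  -d '{"index":"trades_sample","query":{"match_all":{}}}'
```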

Or, in case you need HTTPS and want to integrate it into an existing web server, you can expose it on another port and another domain, e.g.:

auth:
...
environment:
...
  - SERVER_NAME=manticore.yourdomain.com
...
  ports:
    - "127.0.0.1:8008:80"

and then in your web server's config just set up proxying to the Docker container, for example:

server {
    listen 443 ssl;
    server_name manticore.yourdomain.com;

    http2_idle_timeout 5m; # up from 3m default

    location / {
        proxy_pass http://localhost:8008/;
        proxy_set_header Host $http_host;
        proxy_http_version 1.1;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto https;
    }

#    ssl_certificate /path/to/manticore.yourdomain.com/fullchain.pem;
#    ssl_certificate_key /path/to/manticore.yourdomain.com/privkey.pem;

}

server {
    server_name manticore.yourdomain.com;

    if ($host = manticore.yourdomain.com) {
        return 301 https://$host$request_uri;
    }

    listen 80;
    return 404;
}

Then:

  • either use an existing certificate (then uncomment the ssl* lines above)
  • or use certbot (e.g. run certbot --nginx) to prepare the certificates for you
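Once the certificate is in place, the whole chain (nginx TLS termination → auth proxy → Manticore) can be checked via Manticore's /sql HTTP endpoint, e.g. (a sketch; the domain and credentials are the placeholders used above):

```bash
curl -u username:changeme "https://manticore.yourdomain.com/sql?mode=raw" \
  --data-urlencode "query=SHOW TABLES"
```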

I have been testing this for the past two days and the memory is increasing all the time.
What is the reason for this?

Multiple reasons:

  • the table's RAM chunk: you may want to lower rt_mem_limit or run FLUSH RAMCHUNK
  • you are using row-wise storage, and some table files have to be kept in RAM for optimal performance

You may also want to read https://manual.manticoresearch.com/Creating_a_table/Local_tables/Plain_and_real-time_table_settings#Accessing-table-files to learn more about which table files can be kept on disk or in memory and how to control that.
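For the first point, a sketch of lowering rt_mem_limit on an existing RT table (the table name comes from the report above, and '64M' is just an example value):

```sql
-- Lower the RAM chunk size limit so data is saved to disk chunks sooner.
ALTER TABLE trades_sample rt_mem_limit = '64M';
```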

@sanikolaev
Collaborator

I have been testing this for the past two days and the memory is increasing all the time.

To understand whether this is OK or not, please run FLUSH RTINDEX and then provide:

  • ls -la of your table files
  • ps aux|grep searchd
  • sudo pmap -px <searchd's pid>
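Put together, the requested checks might look like this (a sketch; the data directory and table name are assumptions based on the compose file and sample data above, and a running searchd is required):

```bash
mysql -h 127.0.0.1 -P 9306 -e "FLUSH RTINDEX trades_sample;"
ls -la /var/lib/manticore/trades_sample*
ps aux | grep searchd
sudo pmap -px "$(pidof searchd)"
```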

@sanikolaev added the `waiting` label (Waiting for the original poster, in most cases, or something else) Feb 14, 2023
@majid1605
Author

majid1605 commented Feb 14, 2023

To understand if it's ok or not please run FLUSH RTINDEX and then provide:

The results before and after the flush command:

Container start time:
Screenshot_20230214_150509

After three hours:
Screenshot_20230214_175540

After running the command:
Screenshot_20230214_181040

After four hours the memory is increasing again:
Screenshot_20230214_221306

@sanikolaev
Collaborator

Please provide the pmap output

@githubmanticore
Contributor

➤ Sergey Nikolaev commented:

Please provide the pmap output

Never mind, we've reproduced the issue.

@githubmanticore removed the `waiting` label (Waiting for the original poster, in most cases, or something else) Feb 16, 2023
@githubmanticore
Contributor

githubmanticore commented Feb 17, 2023

➤ Stan commented:

Should be fixed in 0fde0b5: fixed memory leaks when reading the console of the buddy process.

@tomatolog
Contributor

Could you try a package with 0fde0b5 in its name from the dev repo, after CI passes and the packages are published?

CI usually takes about 2 hours to finish.

Some details on how to get the package from dev repo:
https://manual.manticoresearch.com/Installation/RHEL_and_Centos#Development-packages

@sanikolaev
Collaborator

This has been fixed and released in Manticore 6.0.4.
