
[Bug]: Persistent Storage rebuilding creating a new volume everytime it restarts causing loss in data #5099

Closed
shaneth4312 opened this issue Feb 9, 2025 · 3 comments


@shaneth4312

Error Message and Logs

Since the last update we have had issues with databases not storing data correctly. When making edits in a CMS that stores its data in a database, the next day the data would be gone.

The containers restart each morning and have persistent volumes attached to them; we've been running this setup for around a year now.

I have spent the weekend rebuilding all our servers in the hope that the problem doesn't recur, but it still happens, which makes me think it's something to do with our main Coolify server, which has been affected since the update.

It's worth noting that just before this issue we hit Docker rate limits, which stopped all our containers from running; we had to run docker login on all our servers and restart them all for the 500 error to disappear, then rebuild all our containers manually (around 70 in total). It's happening on Postgres, MariaDB and MySQL (they're the ones we use).

An example: "Wordpress 01" and "Database 01" are connected; Wordpress 01 sends its data to Database 01, and the next morning after the restart that data is gone. After some testing, when we restart a database the storage rebuilds and a brand-new volume is created. Something it didn't do before.
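One way to confirm whether a restart is recreating the volume (rather than reusing it) is to compare the volume's `CreatedAt` timestamp before and after the restart. This is a minimal sketch; the volume name `database-01-data` is a hypothetical example, not the actual Coolify-generated name:

```shell
# List volumes matching the database resource (name filter is hypothetical).
docker volume ls --filter name=database-01

# Inspect the volume's creation time. If this timestamp changes after a
# restart, the volume was deleted and recreated instead of being reattached.
docker volume inspect database-01-data --format '{{ .CreatedAt }}'
```

If the timestamp changes across restarts, that matches the behaviour described above.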

I have been able to note the steps below and have found a workaround:

  • After upgrading from 389 to 390, the containers wipe all data on persistent storage and create new volumes (this could also be related to the rate-limit bug)
  • Creating a new server and cloning the resource over to it carries the issue over, and containers rebuild despite being persistent.

Temp Fix:

  • Backup database or persistent storage
  • Create new Server
  • Create brand new resource
  • Import backup (or copy data of persistent storage)

While I call this a fix, it is definitely not ideal for 70+ services.
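For the Postgres case, the backup and import steps above can be sketched with `pg_dump` and `psql` run through the database container. This is only an illustration under assumptions: the container names (`database-01`, `database-01-new`), user, and database name are hypothetical:

```shell
# 1. Dump the database from the old container before it is restarted
#    (container name, user, and database name are examples).
docker exec database-01 pg_dump -U postgres -d app_db > app_db.sql

# 2. After creating the new server and a brand-new Postgres resource in
#    Coolify, restore the dump into the new container.
docker exec -i database-01-new psql -U postgres -d app_db < app_db.sql
```

For the MariaDB/MySQL services, the equivalent tools would be `mysqldump` and the `mysql` client.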

Steps to Reproduce

  1. Coolify running on 389
  2. Have a Persistent Storage running (Postgres 16)
  3. Upgrade to 390
  4. Change some data in Postgres 16
  5. Restart server running Postgres 16
  6. Data wipes

Example Repository URL

No response

Coolify Version

v4.0.0-beta.390

Are you using Coolify Cloud?

No (self-hosted)

Operating System and Version (self-hosted)

Debian 12.8

Additional Information

Worth noting that after the upgrade to 390 we hit Docker's rate limit for some reason, despite not having any more resources than before. This caused all 70+ containers to error, and Coolify constantly hit 500 error screens. We had to stop all servers, run docker login, and restart them all to remove this screen. The upgrade and the Docker rate-limit error happened simultaneously, so they could be linked.
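As a rough sketch, the rate-limit recovery described above amounts to the following on each affected server (exact service names and Coolify redeploy steps depend on the host setup):

```shell
# Authenticate to Docker Hub so image pulls count against the authenticated
# (higher) rate limit instead of the anonymous one.
docker login

# Restart the Docker daemon so pulls use the new credentials, then let
# Coolify redeploy the affected containers.
sudo systemctl restart docker
```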

@shaneth4312 shaneth4312 added 🐛 Bug Reported issues that need to be reproduced by the team. 🔍 Triage Issues that need assessment and prioritization. labels Feb 9, 2025
@peaklabs-dev
Member

Do you have any file mounts in addition to the persistent volume?

@peaklabs-dev peaklabs-dev added 💤 Waiting for feedback Issues awaiting a response from the author. and removed 🔍 Triage Issues that need assessment and prioritization. labels Feb 10, 2025
@shaneth4312
Author

Do you have any file mounts in addition to the persistent volume?

Hey,

Yes, we have file mounts too.

Thank you,
Shane

@peaklabs-dev
Member

Then that is the problem. It will be fixed via this PR. #5027

@github-actions github-actions bot removed 🐛 Bug Reported issues that need to be reproduced by the team. 💤 Waiting for feedback Issues awaiting a response from the author. labels Feb 10, 2025