Veronica Charlton's picture


Initially, when I install the appliance, it boots and runs.


I use turnkey-init to set up the information needed for GitLab.


GitLab runs, but when I try to log in on the web interface I get a 500 error.


If I reboot, GitLab is not started during the boot-up process.



Jeremy Davis's picture

Argh! Bloody GitLab! One of the most painful appliances to maintain...!

Ok, sorry for my rant. I'll take a few deep breaths...

I don't yet have a fix, but I can confirm the bug. FWIW, at face value, it appears that between my final test of the new appliance and the final build (~1-2 hours), they released a new version which was either buggy or at least not compatible with our setup. Argh!

I want to blame GitLab (because I really do hate it), but really it's my fault. I should have done a final test after doing the final build destined for publishing. But I'd spent so long working on it, it was so late on Friday when I finally got it working, and it's so resource-intensive and so slow to build, start, and therefore test (plus, I'd tested my local build so intensively) that I skipped that step. I can only assume that within that 1-2 hour window, they released a new version that broke things for us... Following my previous experience with GitLab, I should have known that would bite me...

I can't yet confirm, but I suspect that installing (i.e. updating to) the latest GitLab version should fix it. I had hoped to be able to post back and confirm (or not, as the case may be) by now, but after 20 minutes, it's still only at 60%. In the meantime, here's the command to update GitLab CE (run as the root user, or prefix with 'sudo'):

apt update
apt install gitlab-ce
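Before pulling in whatever is newest, it can be worth checking which version apt would actually install. A small sketch using standard apt tooling (the version syntax in the comment is purely illustrative):

```shell
# Show installed vs. candidate versions of gitlab-ce
# without changing anything on the system.
apt update
apt-cache policy gitlab-ce

# To upgrade to a specific version rather than the newest candidate:
# apt install gitlab-ce=<version>   # pick a version from the policy output
```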
Jeremy Davis's picture

Sorry, I can now confirm that even updating GitLab-CE doesn't work. I'm still not clear what the issue is, but it appears to run deeper than I suspected...

In the meantime, I've reverted the GitLab appliance page back to the previous v16.0.

Veronica Charlton's picture

I understand how things go sometimes. Been around the block a few times. 

My environment is on a closed network. I will grab the current one linked and bring it up, then move it over and check it out.



Veronica Charlton's picture

I installed the v16.0 version and it appears to go through the initialization process properly. When I check using gitlab-ctl status, I see everything I would expect running.

When I attempt to access it through the web interface, it does not connect. I can ping the VM, so I know that part is working.

When I do a netstat, I do not see any connections to port 80; GitLab's only connection seems to be on the localhost address. (I have an external DNS that is resolving the IP correctly.)
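One way to separate "services running" from "reachable" is to check what is actually listening and whether a connection succeeds. A minimal bash sketch (the `ss` invocation assumes iproute2 is installed; port 80 is assumed from the default HTTP setup):

```shell
# List all listening TCP sockets with owning processes; GitLab's
# bundled nginx should appear bound to 0.0.0.0:80, not 127.0.0.1:80.
ss -tlnp 2>/dev/null || true   # may require the iproute2 package

# Pure-bash probe using the /dev/tcp pseudo-device: does a TCP
# connect to host:port succeed from this machine?
check_port() {
    local host=$1 port=$2
    if (exec 3<>"/dev/tcp/${host}/${port}") 2>/dev/null; then
        echo "open"
    else
        echo "closed"
    fi
}

check_port 127.0.0.1 80   # "open" if something listens locally on 80
```

If `check_port <vm-ip> 80` fails from another machine while ping succeeds, that matches nginx being bound only to localhost; in the omnibus setup, the `external_url` setting in /etc/gitlab/gitlab.rb followed by `gitlab-ctl reconfigure` is the usual place to look.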

I am also able to access the webadmin interface for the server itself.

That is where things stand at the moment. 

badco's picture

Can you show the error you get when you try to access the webUI? (screenshot would be good)

Veronica Charlton's picture

I apologize for not getting back to you.


In all honesty, I installed CentOS in a new appliance and then GitLab on top of that.


That is working, and I have not gone back to try to figure out why the TurnKey one isn't.

badco's picture

Alternatively, you could use TKLCore LXC and install GitLab into that; then you get Webmin and other features.

Michael's picture

I have the same problem as described in the first comment. I can offer a small workaround for one part of the problem:

To restart GitLab after a reboot, you can run the following on the console:

systemctl enable gitlab-runsvdir
systemctl start gitlab-runsvdir
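The two commands above can be combined, and the result verified, with standard systemd tooling (unit name taken from the workaround above):

```shell
# Enable at boot and start immediately in one step
systemctl enable --now gitlab-runsvdir

# Verify: these should print "enabled" and "active" respectively
systemctl is-enabled gitlab-runsvdir
systemctl is-active gitlab-runsvdir
```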


But for the 500 error I have no solution yet.

Alex's picture

I'm trying to understand the GitLab 500 error, but it's not easy.

The command `gitlab-rake gitlab:check --trace` reports no issues:

root@git .../log/gitlab#  gitlab-rake gitlab:check --trace
** Invoke gitlab:check (first_time)
** Invoke gitlab_environment (first_time)
** Invoke environment (first_time)
** Execute environment
** Execute gitlab_environment
** Execute gitlab:check
Checking GitLab subtasks ...

Checking GitLab Shell ...

GitLab Shell: ... GitLab Shell version >= 13.17.0 ? ... OK (13.17.0)
Running /opt/gitlab/embedded/service/gitlab-shell/bin/check
Internal API available: OK
Redis available via internal API: OK
gitlab-shell self-check successful

Checking GitLab Shell ... Finished

Checking Gitaly ...

Gitaly: ... default ... OK

Checking Gitaly ... Finished

Checking Sidekiq ...

Sidekiq: ... Running? ... yes
Number of Sidekiq processes (cluster/worker) ... 1/1

Checking Sidekiq ... Finished

Checking Incoming Email ...

Incoming Email: ... Reply by email is disabled in config/gitlab.yml

Checking Incoming Email ... Finished

Checking LDAP ...

LDAP: ... LDAP is disabled in config/gitlab.yml

Checking LDAP ... Finished

Checking GitLab App ...

Git configured correctly? ... yes
Database config exists? ... yes
All migrations up? ... yes
Database contains orphaned GroupMembers? ... no
GitLab config exists? ... yes
GitLab config up to date? ... yes
Log directory writable? ... yes
Tmp directory writable? ... yes
Uploads directory exists? ... yes
Uploads directory has correct permissions? ... yes
Uploads directory tmp has correct permissions? ... skipped (no tmp uploads folder yet)
Init script exists? ... skipped (omnibus-gitlab has no init script)
Init script up-to-date? ... skipped (omnibus-gitlab has no init script)
Projects have namespace: ... 
GitLab Instance / Monitoring ... yes
Redis version >= 5.0.0? ... yes
Ruby version >= 2.7.2 ? ... yes (2.7.2)
Git version >= 2.31.0 ? ... yes (2.31.1)
Git user has default SSH configuration? ... yes
Active users: ... 1
Is authorized keys file accessible? ... yes
GitLab configured to store new projects in hashed storage? ... yes
All projects are in hashed storage? ... yes

Checking GitLab App ... Finished

Checking GitLab subtasks ... Finished

root@git .../log/gitlab#


The command `gitlab-ctl tail` reports the error:

==> /var/log/gitlab/gitlab-rails/production.log <==
OpenSSL::Cipher::CipherError ():
lib/gitlab/database.rb:342:in `block in transaction'
lib/gitlab/database.rb:341:in `transaction'
app/services/application_settings/update_service.rb:50:in `update_settings'
lib/gitlab/metrics/instrumentation.rb:160:in `block in update_settings'
lib/gitlab/metrics/method_call.rb:27:in `measure'
lib/gitlab/metrics/instrumentation.rb:160:in `update_settings'
app/services/application_settings/update_service.rb:12:in `execute'
lib/gitlab/metrics/instrumentation.rb:160:in `block in execute'
lib/gitlab/metrics/method_call.rb:27:in `measure'
lib/gitlab/metrics/instrumentation.rb:160:in `execute'
app/controllers/application_controller.rb:531:in `disable_usage_stats'
app/controllers/application_controller.rb:517:in `set_usage_stats_consent_flag'
app/controllers/application_controller.rb:463:in `block in set_current_context'
lib/gitlab/application_context.rb:70:in `block in use'
lib/gitlab/application_context.rb:70:in `use'
lib/gitlab/application_context.rb:27:in `with_context'
app/controllers/application_controller.rb:454:in `set_current_context'
lib/gitlab/metrics/elasticsearch_rack_middleware.rb:16:in `call'
lib/gitlab/middleware/rails_queue_duration.rb:33:in `call'
lib/gitlab/metrics/rack_middleware.rb:16:in `block in call'
lib/gitlab/metrics/transaction.rb:56:in `run'
lib/gitlab/metrics/rack_middleware.rb:16:in `call'
lib/gitlab/request_profiler/middleware.rb:17:in `call'
lib/gitlab/jira/middleware.rb:19:in `call'
lib/gitlab/middleware/go.rb:20:in `call'
lib/gitlab/etag_caching/middleware.rb:21:in `call'
lib/gitlab/middleware/multipart.rb:172:in `call'
lib/gitlab/middleware/read_only/controller.rb:50:in `call'
lib/gitlab/middleware/read_only.rb:18:in `call'
lib/gitlab/middleware/same_site_cookies.rb:27:in `call'
lib/gitlab/middleware/handle_malformed_strings.rb:21:in `call'
lib/gitlab/middleware/basic_health_check.rb:25:in `call'
lib/gitlab/middleware/handle_ip_spoof_attack_error.rb:25:in `call'
lib/gitlab/middleware/request_context.rb:21:in `call'
config/initializers/fix_local_cache_middleware.rb:11:in `call'
lib/gitlab/middleware/rack_multipart_tempfile_factory.rb:21:in `call'
lib/gitlab/metrics/requests_rack_middleware.rb:76:in `call'
lib/gitlab/middleware/release_env.rb:12:in `call'

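For what it's worth, an `OpenSSL::Cipher::CipherError` raised while reading application settings usually points at a mismatch between the encryption secrets in /etc/gitlab/gitlab-secrets.json and values already encrypted in the database (for example, if the secrets file was regenerated during a rebuild). A diagnostic sketch assuming standard omnibus paths; this is a guess at the cause, not a confirmed fix, and the `.bak` filename is hypothetical:

```shell
# The omnibus secrets file holds db_key_base, which GitLab uses to
# decrypt encrypted database columns. If it no longer matches the key
# the data was written with, reads fail with CipherError.
ls -l /etc/gitlab/gitlab-secrets.json

# If a backup copy of the secrets file exists, compare them:
diff /etc/gitlab/gitlab-secrets.json /etc/gitlab/gitlab-secrets.json.bak
```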

I also found an error in the PostgreSQL log which was not present in the previous package:

2021-07-21_21:05:08.72639 LOG:  configuration file "/var/opt/gitlab/postgresql/data/postgresql.conf" contains errors; unaffected changes were applied
2021-07-21_21:05:08.95919 LOG:  no match in usermap "gitlab" for user "gitlab" authenticated as "root"
2021-07-21_21:05:08.95921 FATAL:  Peer authentication failed for user "gitlab"
2021-07-21_21:05:08.95921 DETAIL:  Connection matched pg_hba.conf line 70: "local   all         all                               peer map=gitlab"
2021-07-21_21:05:09.04222 LOG:  no match in usermap "gitlab" for user "gitlab" authenticated as "root"
2021-07-21_21:05:09.04223 FATAL:  Peer authentication failed for user "gitlab"
2021-07-21_21:05:09.04224 DETAIL:  Connection matched pg_hba.conf line 70: "local   all         all                               peer map=gitlab"
2021-07-21_21:05:09.10794 LOG:  no match in usermap "gitlab" for user "gitlab" authenticated as "root"
2021-07-21_21:05:09.10796 FATAL:  Peer authentication failed for user "gitlab"
2021-07-21_21:05:09.10796 DETAIL:  Connection matched pg_hba.conf line 70: "local   all         all                               peer map=gitlab"

But the config line seems identical to the previous package.
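The "no match in usermap" message means peer authentication identified the connecting OS user (root here), but PostgreSQL's ident map named "gitlab" has no rule translating that OS user to the "gitlab" database role. The map lives in pg_ident.conf alongside pg_hba.conf; a sketch of what to look for (the path and example entry are assumed from the standard omnibus layout):

```shell
# pg_hba.conf line 70 references "map=gitlab"; the map itself is
# defined in pg_ident.conf in the same data directory.
cat /var/opt/gitlab/postgresql/data/pg_ident.conf

# Entries have the form:  MAPNAME  SYSTEM-USERNAME  PG-USERNAME
# e.g. a line allowing the OS user "git" (the usual omnibus app user)
# to connect as the "gitlab" database role would look like:
#   gitlab   git   gitlab
# The log above shows "root" connecting, which has no mapping --
# suggesting something is running database tasks as the wrong OS user.
```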


I will continue to investigate a bit.

