Hello,
My session systematically crashes (500) when I try to open or download one of the "big" help files. Smaller files open OK.
Can anyone check that I am not crazy?
Thanks
Hi Olivier, *
Hello,
My session systematically crashes (500) when I try to open or download
one of the "big" help files. Smaller files open OK. Can anyone check that I am not crazy?
This might be caused by the database backup that is run every three hours -
when the backup runs, it locks the tables against other access, so
Pootle is basically non-functional during that time (the backup takes
around 5 minutes).
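For illustration only - the actual backup job is not shown in this thread, and the database name, credentials file and output path below are made up - a three-hourly dump driven from a small Python wrapper could look roughly like the sketch below. The relevant point is that mysqldump's default --opt mode implies --lock-tables, which is why Pootle stalls while the dump runs.

#!/usr/bin/env python3
# Hypothetical sketch of the three-hourly Pootle database backup -
# database name, credentials file and output path are made up.
# Scheduling would normally be a cron entry such as "0 */3 * * *".
import gzip
import subprocess
import time

DB_NAME = "pootle"                       # assumed database name
DEFAULTS_FILE = "/etc/mysql/backup.cnf"  # assumed file holding user/password
OUTPUT = "/var/backups/pootle-%s.sql.gz"


def dump_once():
    out_path = OUTPUT % time.strftime("%Y%m%d-%H%M%S")
    # mysqldump's default --opt mode implies --lock-tables: with MyISAM
    # tables the whole database stays read-locked for the duration of
    # the dump, which is why Pootle appears frozen while this runs.
    proc = subprocess.Popen(
        ["mysqldump", "--defaults-extra-file=" + DEFAULTS_FILE, DB_NAME],
        stdout=subprocess.PIPE,
    )
    with gzip.open(out_path, "wb") as out:
        for chunk in iter(lambda: proc.stdout.read(64 * 1024), b""):
            out.write(chunk)
    return proc.wait()


if __name__ == "__main__":
    raise SystemExit(dump_once())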
And what does "open or download" mean exactly?
When you request zips for download, I can only write what I've written
many times already: it won't help to click more links on one page in
the hope of getting the files in parallel. Pootle just doesn't work
that way. Click the first one you want, wait until it is served, and
only then click the next one.
ciao
Christian
I get a premature end of script whether I try to upload a file (through the UI, so quite small) or open one:
Premature end of script headers: wsgi.py
I will try during the night; maybe that will be easier.
Kind regards
Sophie
Hi Sophie,
My session systematically crashes (500) when I try to open or
download one of the "big" help files. Smaller files open OK. Can anyone check that I am not crazy?
I get a premature end of script whether I try to upload a file (through the
UI, so quite small) or open one:
Premature end of script headers: wsgi.py
Please: when /exactly/ do you see this?
I will try during the night; maybe that will be easier.
During the night it will not work either if you do it just when the
mysqldump runs.
ciao
Christian
Hi Christian,
Hi Sophie,
My session systematically crashes (500) when I try to open or
download one of the "big" help files. Smaller files open OK. Can anyone check that I am not crazy?
I get a premature end of script whether I try to upload a file (through the
UI, so quite small) or open one:
Premature end of script headers: wsgi.py
Please: when /exactly/ do you see this?
If I want to overwrite a file, for example, I get this error message a little while after hitting the upload button.
I will try during the night; maybe that will be easier.
During the night it will not work either if you do it just when the
mysqldump runs.
Well, I need to find a moment when I can upload them anyway.
Kind regards
Sophie
Hi Sophie, *,
My session systematically crashes (500) when I try to open or
download one of the "big" help files. Smaller files open OK. Can anyone check that I am not crazy?
I get a premature end of script whether I try to upload a file (through the
UI, so quite small) or open one:
Premature end of script headers: wsgi.py
Please: when /exactly/ do you see this?
If I want to overwrite a file, for example, I get this error message a
little while after hitting the upload button.
By "when" I meant the time, but knowing exactly what the actions are
of course also helps to recreate the problem.
Again: when exactly? At what time did you try it?
I want to compare that to the times when the mysqldumps run and see
what Munin recorded for that time, i.e. where the bottleneck is.
Well, I need to find a moment when I can upload them anyway.
Well, what about just retrying right now?
ciao
Christian
Server error!
The server encountered an internal error and was unable to complete
your request.
Error message:
Premature end of script headers: wsgi.py
If you think this is a server error, please contact the webmaster.
Error 500
translations.documentfoundation.org
Wed Apr 13 18:07:41 2011
Apache
Hi Christian
Hi Olivier, *
Hello,
My session systematically crashes (500) when I try to open or download
one of the "big" help files. Smaller files open OK. Can anyone check that I am not crazy?
This might be caused by the database backup that is run every three hours -
when the backup runs, it locks the tables against other access, so
Pootle is basically non-functional during that time (the backup takes
around 5 minutes).
And what does "open or download" mean exactly?
In the folder
https://translations.documentfoundation.org/pt_BR/libo34x_help/swriter/
the file 01.po has 2 words to review. If I click on "2 words need attention", I get the crash every time.
The error mentions wsgi.py, as Sophie reported.
Hi Olivier,
Hi Christian
Hi Olivier, *
Hello,
My session systematically crashes (500) when I try to open or
download one of the "big" help files. Smaller files open OK.
Can anyone check that I am not crazy?
This might be caused by the database backup that is run every three hours -
when the backup runs, it locks the tables against other access, so
Pootle is basically non-functional during that time (the backup takes
around 5 minutes).
And what does "open or download" mean exactly?
In the folder
https://translations.documentfoundation.org/pt_BR/libo34x_help/swriter/
the file 01.po has 2 words to review. If I click on "2 words need
attention", I get the crash every time.
The error mentions wsgi.py, as Sophie reported.
I've discussed this with Christian on IRC and he is currently monitoring what is happening and how.
Kind regards
Sophie
Hi Anton,
Thanks for reporting. Christian is currently evaluating what might be happening with the database.
Kind regards
Sophie
Hi all,
Sorry Christian, my English is not fluent and that's the reason I haven't
commented on the problems in detail.
Perhaps Friedel can help more here. I think he was supporting the
OOo.org Pootle installation, which is running perfectly.
Can I suggest temporarily hiding the Help files? It is possible that the
diff and mess functions on the .po files overload the server.
Antón
Hi Anton, *,
Sorry Christian, my English is not fluent and that's the reason I haven't
commented on the problems in detail.
Perhaps Friedel can help more here. I think he was supporting the
OOo.org Pootle installation, which is running perfectly.
Well, they have tons of RAM available (IIRC the server has 12GB of RAM and is
only used for Pootle) and can put the whole database into
RAM without problems; then the drawbacks of how Pootle uses the
database, and also the drawback of having table locks with MyISAM,
don't impact performance that much.
But we're running Pootle inside a VM, and there is not enough RAM on
the host to allow file caching of this magnitude.
The problem is that when it needs to access the disk to read the
database, it can easily come to a point where user A requests data
that is in block a on the disk, and user B requests data that is in
block b, but both are not in RAM at the same time, so it has to
constantly read stuff from disk - and that is slow. It is even worse when
some lengthy process is run over the whole database (like when
uploading files and regenerating statistics, and similar).
Can I suggest temporarily hiding the Help files? It is possible that the
diff and mess functions on the .po files overload the server.
I have now tweaked both MySQL's parameters and the setup of the
VirtualBox VM so that parts of the database can be cached by the
host, so that the VM doesn't need to do real disk I/O anymore (or at
least does so to a much lesser extent).
While this does not completely eliminate wait times (for example when
mysqldump is running, it locks the tables and thus no other accesses
to the tables are possible during the dump), those waiting times should
now be much shorter (as mysqldump completes faster, all other slow
operations should also complete faster, so it is less likely to get
into an "I want block A, but another user wants B, and yet another one wants
D" situation at the same time).
ciao
Christian
Understood.
It's logical. At the moment it works very well and sounds promising.
Cheers