Message boards : News : Temporary work unavailability
Joined: 11 Oct 20 Posts: 339 Credit: 25,713,477 RAC: 7,975
Dear participants, after sending the last tasks for target 20 (corona_DHODH_v1) we will start an experiment with one of the already processed targets, but with new parameters. Results for these tasks will be large, and we will slow down the creation of new tasks to reduce the load on the network infrastructure. Often there will be no tasks available. Thank you for supporting the project!
Joined: 24 Oct 20 Posts: 12 Credit: 9,492,331 RAC: 78,651
By “large”, do you mean long run times? And if so, will machines like Raspberry Pis be able to return the tasks by the deadlines?
Reno, NV
Team: SETI.USA
Joined: 11 Oct 20 Posts: 339 Credit: 25,713,477 RAC: 7,975
> By “large”, do you mean long run times? And if so, will machines like Raspberry Pis be able to return the tasks by the deadlines?
Hi! Large in size, but short in run time.
Joined: 10 Feb 22 Posts: 31 Credit: 478,897,665 RAC: 268,345
Is compression maximized already, or would packing the outgoing work unit and incoming result into a 7z archive at maximum compression give a noticeably smaller transfer size? I have no idea if it's feasible to automate reliably without a lot of coding, but it may be worth a look.
Joined: 3 Dec 20 Posts: 6 Credit: 1,910,222 RAC: 3,129
Some of the new work is being released and it is processing nicely. I have successfully completed 3 of the 5 tasks distributed to me, and the other 2 are running now.
Bill F
In October 1969 I took an oath to support and defend the Constitution of the United States against all enemies, foreign and domestic; there was no expiration date.
Joined: 11 Oct 20 Posts: 339 Credit: 25,713,477 RAC: 7,975
> Is compression maximized already, or would packing the outgoing work unit and incoming result into a 7z archive at maximum compression give a noticeably smaller transfer size? I have no idea if it's feasible to automate reliably without a lot of coding, but it may be worth a look.
Hi! :) Good idea. Packing results into a zip archive adds some steps to result processing, but maybe we will implement it.
Joined: 11 Oct 20 Posts: 339 Credit: 25,713,477 RAC: 7,975
> Some of the new work is being released and it is processing nicely. I have successfully completed 3 of the 5 tasks distributed to me, and the other 2 are running now.
Hi! :) I think those were "repeated" tasks for the DHODH_v1 target. The first tasks for the new experiment have just been generated.
Joined: 10 Feb 22 Posts: 31 Credit: 478,897,665 RAC: 268,345
zip is an old-school format! .7z or other modern algorithms would probably give a far better compression ratio. You already use zip, for instance on CmDock_v0.1.4_2.1_windows_x86_64.zip, which is 4.85 MB in size. If I extract it and put the contents into a .7z at maximum compression, it is 3 MB, and database-type files should see an improvement too. A software solution is probably a lot easier than dealing with network hardware woes.
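For anyone curious how much the archive format alone can matter, here is a minimal sketch (Python standard library only) that compares DEFLATE, the algorithm behind .zip, with LZMA, the algorithm behind .7z, on a local file. The file name result.sd is a hypothetical placeholder, not an actual SiDock@home file.
[code]
# Minimal sketch: compare zip-style (DEFLATE) and 7z-style (LZMA) compression
# of a result file, using only the Python standard library.
# "result.sd" is a hypothetical placeholder, not a real SiDock@home file name.
import lzma
import zlib
from pathlib import Path

def compare_compression(path: str) -> None:
    data = Path(path).read_bytes()
    deflated = zlib.compress(data, level=9)      # roughly what .zip uses
    lzma_packed = lzma.compress(data, preset=9)  # the algorithm behind .7z
    print(f"original: {len(data):>12,} bytes")
    print(f"DEFLATE : {len(deflated):>12,} bytes ({len(deflated) / len(data):.1%})")
    print(f"LZMA    : {len(lzma_packed):>12,} bytes ({len(lzma_packed) / len(data):.1%})")

if __name__ == "__main__":
    compare_compression("result.sd")  # hypothetical docking result file
[/code]
On text-heavy docking output, LZMA usually comes out noticeably smaller than DEFLATE, at the cost of more CPU time and memory on both the packing and unpacking side.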
Joined: 3 Jan 21 Posts: 11 Credit: 11,946,843 RAC: 15,528
> Dear participants, after sending the last tasks for target 20 (corona_DHODH_v1) we will start an experiment with one of the already processed targets, but with new parameters. Results for these tasks will be large, and we will slow down the creation of new tasks to reduce the load on the network infrastructure. Often there will be no tasks available.
Thanks for that hoarfrost, the updates are very much appreciated. Just wondering about some of the recent work that has been released: on 2 near-identical systems (Ryzen 5900X CPUs), one Windows and one Linux, for the same run time the Windows computer gets 20 points but the Linux computer gets just 1 point. This has been happening since the recent short test releases. I am very curious as to why the difference?
Thanks
Conan
Joined: 3 Jan 21 Posts: 11 Credit: 11,946,843 RAC: 15,528
> Dear participants, after sending the last tasks for target 20 (corona_DHODH_v1) we will start an experiment with one of the already processed targets, but with new parameters. Results for these tasks will be large, and we will slow down the creation of new tasks to reduce the load on the network infrastructure. Often there will be no tasks available.
My other Linux computer gives even less, at just 0.6 points per WU. I may reset the project to see if anything changes.
Conan
Joined: 11 Oct 20 Posts: 339 Credit: 25,713,477 RAC: 7,975
Hello Conan! Do these computers have the right benchmarks?
Joined: 3 Jan 21 Posts: 11 Credit: 11,946,843 RAC: 15,528
> Hello Conan! Do these computers have the right benchmarks?
G'Day hoarfrost,
As far as I can tell they do. I don't alter or stop BOINC from generating benchmarks, as that always gives very low points (e.g. stopped benchmarks report the defaults of 1 billion floating-point and 1 billion integer operations per second, and my computers are many times those figures). Any time I see low points, that 1-billion figure for both benchmark results is nearly always the cause, and I hate it. I have not received any work on my Linux machines yet to see if the resets made any difference.
Thanks
Conan
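To illustrate the point about stuck default benchmarks, here is a rough sketch of BOINC's classic benchmark-based claimed credit, assuming the Cobblestone scale of 200 credits per day of sustained 1 GFLOPS. It is an illustration only, not SiDock@home's actual server-side credit code, and the "measured" benchmark figures below are made up.
[code]
# Rough sketch of classic benchmark-based claimed credit in BOINC,
# assuming the Cobblestone scale: 200 credits per day of sustained 1 GFLOPS.
# Not the SiDock@home server's actual credit code.
SECONDS_PER_DAY = 86_400
CREDITS_PER_GFLOPS_DAY = 200

def claimed_credit(cpu_seconds: float, whetstone_ops: float, dhrystone_ops: float) -> float:
    # Average the floating-point and integer benchmarks (ops per second),
    # convert to GFLOPS-days of work, then apply the Cobblestone scale.
    avg_benchmark = (whetstone_ops + dhrystone_ops) / 2
    gflops_days = avg_benchmark / 1e9 * cpu_seconds / SECONDS_PER_DAY
    return gflops_days * CREDITS_PER_GFLOPS_DAY

four_hours = 4 * 3600
print(claimed_credit(four_hours, 1e9, 1e9))    # stuck default benchmarks -> ~33
print(claimed_credit(four_hours, 6e9, 30e9))   # made-up measured benchmarks -> ~600
[/code]
The point is the ratio: a host reporting only the 1e9/1e9 defaults claims far less than one reporting measured benchmarks, regardless of how much work it actually did.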
Joined: 3 Jan 21 Posts: 11 Credit: 11,946,843 RAC: 15,528
Resetting the computers made no difference to the awarded points: just 0.69 for the latest Linux work unit. (My computers are not hidden, so you can see the benchmark results for both Linux computers.)
Thanks
Conan
Joined: 23 Dec 20 Posts: 20 Credit: 1,360,768 RAC: 0
I only have Linux PCs (some old) now and am only getting 1 or 2 credits per WU, not sure what I got before. PCs visible. Paul. |
Joined: 8 Sep 21 Posts: 13 Credit: 3,070,202 RAC: 3,221
hoarfrost, is there going to be any increase in the output of tasks? My RAC is dying on your project. I just got 2 tasks at 21:00 CET, and they take only around 4 hours to run. So my system is always ready for more, and I don't get tasks all that often now.
Joined: 8 Sep 21 Posts: 13 Credit: 3,070,202 RAC: 3,221
> I only have Linux PCs (some old) now and am only getting 1 or 2 credits per WU, not sure what I got before.
That's all I am getting as well, 0.83 to 1 credit per task. But now with this server issue, the RAC goes from 3,000 to 1,000 and now 500. Unreal.
Joined: 2 Feb 21 Posts: 1 Credit: 22,461,851 RAC: 9,457
I prefer .rar, and you can set the compression level in 7z (it got popular because it's free). With .zip the majority of people get suckered into paying, and with WinRAR too, even though they keep working for years after. But dude, tar.gzip (tar.gz) FTW.
Joined: 10 Feb 22 Posts: 31 Credit: 478,897,665 RAC: 268,345
It seems to me that this is not, or not only, a network bandwidth issue. I don't think the current work unit throughput comes close to matching the previous network utilization.
Joined: 11 Oct 20 Posts: 339 Credit: 25,713,477 RAC: 7,975
Hello! I think that the problem with granted credit on some computers is caused by an imbalance in the estimation of task completion time. We have fixed it and are watching.
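One plausible reading of "imbalance in the estimation of task completion time" is a mismatched per-workunit rsc_fpops_est (BOINC's estimate of the floating-point operations a task needs). The sketch below shows how per-task credit collapses when credit is scaled from that estimate and the estimate is far below the real work; the mechanism, scale, and numbers are assumptions for illustration, not a description of the actual fix.
[code]
# Hedged sketch: how an underestimated rsc_fpops_est can depress per-task credit
# when credit is scaled from the estimated work rather than the work actually done.
# The Cobblestone scale (200 credits per GFLOPS-day) and all numbers are
# illustrative assumptions, not SiDock@home's real server settings.
SECONDS_PER_DAY = 86_400
CREDITS_PER_GFLOPS_DAY = 200

def credit_from_estimate(rsc_fpops_est: float) -> float:
    return rsc_fpops_est / 1e9 / SECONDS_PER_DAY * CREDITS_PER_GFLOPS_DAY

actual_fpops = 4 * 3600 * 5e9                   # ~4 h on a ~5 GFLOPS core
print(credit_from_estimate(actual_fpops))       # estimate matches reality -> ~167
print(credit_from_estimate(actual_fpops / 30))  # estimate 30x too low -> ~5.6
[/code]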
Joined: 3 Jan 21 Posts: 11 Credit: 11,946,843 RAC: 15,528
> Hello! I think that the problem with granted credit on some computers is caused by an imbalance in the estimation of task completion time. We have fixed it and are watching.
Thanks for that hoarfrost,
Conan