[TWIM Notes] Feb 1 2021
Joined: 30 Jun 20 | Posts: 462 | Credit: 21,406,548 | RAC: 0
This Week in MLC@Home
Notes for Feb 1 2021
A weekly summary of news and notes for MLC@Home

Summary

The paper is coming together. The first round of datasets has been cut, and I will release them by the end of this week. The only open issue is their size; I will likely make the full, larger datasets available as a torrent. Stay tuned for some new results and the big dataset release this week, with the paper following shortly afterwards.

Detailed News
Joined: 11 Jul 20 | Posts: 33 | Credit: 1,266,237 | RAC: 0
This means we'll start mixing other types of WUs than just DS2 WUs into the GPU queue again, balancing between the remaining DS1/DS2 and DS3 WUs. When DS4 is ready, we'll rebalance again.

Is there a "minimum" GPU for this project? I have an entry-level RX 550X and I don't know if it is usable...
Joined: 30 Jun 20 | Posts: 462 | Credit: 21,406,548 | RAC: 0
GPUs need 2 GB of RAM minimum. The RX 550 is the minimum AMD GPU that will be supported, as I think the RX 540 is not POLARIS-based (gfx803 in AMD nomenclature). POLARIS needs an updated client compiled against a later version of ROCm (ROCm 3.8 has a bug that keeps it from running on POLARIS). It's on the todo list but not ready yet. For now, the only AMD GPUs supported are VEGA-based (gfx900/906/908, not gfx902 APUs) and must be running Linux. Last night I got PyTorch working with ROCm 4, so that should be easier in the future. There's *some* support for NAVI in ROCm, but it's untested. Note that AMD APUs are not supported by ROCm, and thus aren't supported by PyTorch. Yes, that's sad.

On the CUDA side, you need 2 GB of RAM and compute capability 3.5 or higher. That's most GTX 700 series cards and up, I think.
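For anyone unsure whether their NVIDIA card clears those two minimums (2 GB of VRAM, compute capability 3.5), they can be checked with a few lines of Python. This is an illustrative sketch, not project code; the function name and parameters are made up, and with PyTorch installed you would feed it the fields of `torch.cuda.get_device_properties(0)`:

```python
# Hypothetical check against the stated CUDA minimums (2 GB VRAM, compute
# capability 3.5). With PyTorch installed, pass in the fields of
# torch.cuda.get_device_properties(0): total_memory, major, minor.

def meets_cuda_minimums(total_memory_bytes, cc_major, cc_minor,
                        min_vram_gb=2.0, min_cc=(3, 5)):
    """Return True if the GPU has enough VRAM and a new-enough compute capability."""
    vram_gb = total_memory_bytes / (1024 ** 3)
    # Tuple comparison handles major/minor ordering correctly: (3, 0) < (3, 5).
    return vram_gb >= min_vram_gb and (cc_major, cc_minor) >= min_cc

# A 4 GB card with compute capability 6.1 (GTX 1050 Ti class) passes:
print(meets_cuda_minimums(4 * 1024 ** 3, 6, 1))   # True
# A 1 GB card fails on memory, even with a high compute capability:
print(meets_cuda_minimums(1 * 1024 ** 3, 7, 5))   # False
```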
Joined: 1 Jul 20 | Posts: 34 | Credit: 26,118,410 | RAC: 0
I am not clear on how to get tasks from the "test" queue. According to the apps page, it requires "Linux running on an AMD x86_64 or Intel EM64T CPU". But if I understand correctly from the above posts, the app is for a GPU, not a CPU. FWIW, I do have an AMD VII, and my settings are configured to allow both GPU and CPU for the "test" application. Also, the server status page says there are 40 tasks available with 8 in progress, yet I am unable to get any, either for CPU or GPU. What am I doing wrong?

Reno, NV
Team: SETI.USA
Joined: 28 Jan 21 | Posts: 1 | Credit: 902,707 | RAC: 8
Hey everyone, I'm glad to see the RX 550 GPU is staying on the list of supported GPUs. This is the only BOINC project I am running at the moment. I am happy to answer any questions the leaders would like to ask about my rig, and I'd appreciate it if someone could go into more detail about running the program more efficiently or strenuously. I plan on upgrading my GPU very soon to put it in a smaller rig. I will stick around the boards for the next few posts. So long, and thanks for all the fish.
Joined: 4 Dec 20 | Posts: 32 | Credit: 47,319,359 | RAC: 0
I've seen you are running only CPU WUs. If you get your RX 550 up and running, please leave a note here. I have a backup system with an RX 570 and could easily install another disk with Linux, if there is a chance of getting it working.
Joined: 30 Jun 20 | Posts: 462 | Credit: 21,406,548 | RAC: 0
The AMD client *really* needs an update. Maybe I'll try to give it some love this weekend. PyTorch 1.8 with official ROCm support was just released today, so maybe it's time to refresh all the clients. Plus, PyTorch updated their static compilation options, so maybe we can go that route this time too; that would be a huge win.

The current AMD client won't support POLARIS (RX 5xx) due to a bug in the version of ROCm it's linked against (3.8). Also, I know of only one other person who has gotten the ROCm client to work, and their WUs revealed there's still a library trying to link against their system's version of MIOpen (a ROCm library) instead of the one we ship (not a huge deal, it'll work, but it should not be doing that). Let me see what I can do this weekend.
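A side note for anyone testing their own PyTorch build against ROCm: ROCm builds of PyTorch report a HIP version via `torch.version.hip` (which is `None` in CUDA builds, where `torch.version.cuda` is set instead). The helper below is a sketch, assuming you pass in those two version fields yourself; the function name is made up for illustration:

```python
# Sketch: classify a PyTorch build by its version fields. In ROCm builds
# torch.version.hip is a version string; in CUDA builds it is None and
# torch.version.cuda is set instead. Call it like:
#   pytorch_backend(torch.version.cuda, getattr(torch.version, "hip", None))

def pytorch_backend(cuda_version, hip_version):
    """Return which GPU backend a PyTorch build was compiled for."""
    if hip_version:          # ROCm/HIP build (AMD GPUs)
        return "rocm"
    if cuda_version:         # CUDA build (NVIDIA GPUs)
        return "cuda"
    return "cpu-only"        # neither field set: CPU-only build

print(pytorch_backend(None, "4.0"))   # rocm
print(pytorch_backend("11.1", None))  # cuda
```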
©2023 MLC@Home Team
A project of the Cognition, Robotics, and Learning (CORAL) Lab at the University of Maryland, Baltimore County (UMBC)