Posts by An0ma1y

1) Message boards : Cafe : How does MLC verify results without running multiple tasks per work unit (redundancy)? (Message 1138)
Posted 7 Apr 2021 by An0ma1y
Post:
Hmm, I hadn't noticed this, but it's a very good question; I would like to know this as well.
2) Message boards : News : [TWIM Notes] Feb 23 2021 (Message 1122)
Posted 10 Mar 2021 by An0ma1y
Post:
Then my opinion, based on this response, is to focus on the anti-cheat measures first, because I would not like that either. The validation problem seems like something to look at and focus on as well.

I guess I don't really know beyond that.
3) Message boards : News : [TWIM Notes] Feb 23 2021 (Message 1110)
Posted 28 Feb 2021 by An0ma1y
Post:
Agree with the above. I recently got into Gridcoin, so it would be fairly good timing for me personally. Aside from that, are there reasons not to do it, or to postpone it?
4) Questions and Answers : Issue Discussion : GPU Utilization and Resource Requirements (Message 1098)
Posted 21 Feb 2021 by An0ma1y
Post:


An option would be to do something similar to Einstein@Home, where the GPU utilization factor can be set in user preferences, instead of having to make our own `app_info.xml` file. In this way you offload the problem of resource scheduling to the BOINC client, which it is quite well designed to do, and for free.

Agree!!! That is why I do not run GPU work here very often, as I do not customize my config files for individual projects. The way Einstein does it makes it very easy to tune the number of WUs per GPU. My 2 cents.
Cheers

Yes, please!
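
In the meantime, for anyone who wants to tune this per project by hand, here is a rough sketch of what a BOINC `app_config.xml` in the project's directory might look like to run two tasks per GPU. The app name below is just a placeholder; I don't know the project's actual app names, so check the client logs or the project page first.

```xml
<app_config>
  <app>
    <!-- placeholder name; replace with the project's real app name -->
    <name>mlds_app</name>
    <gpu_versions>
      <!-- 0.5 GPU per task, i.e. two tasks share one GPU -->
      <gpu_usage>0.5</gpu_usage>
      <!-- one CPU core reserved per GPU task -->
      <cpu_usage>1.0</cpu_usage>
    </gpu_versions>
  </app>
</app_config>
```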
5) Questions and Answers : Issue Discussion : All my GPU applications have crushed. (Message 795)
Posted 10 Nov 2020 by An0ma1y
Post:
Not all of my GPU apps have crashed, but a decent number have.

I am running an RTX 2070 Super and also a GTX 950.

They don't seem to get much load at all; my 2070 Super shows maybe 1% load according to Task Manager. I find that my readings tend to be somewhat unreliable, though, possibly because I'm running the RTX 2070 Super alongside the 950, but I don't really know.
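
(Side note: I've read that Task Manager's default GPU graphs don't necessarily show compute load, so readings can look near zero even while work is running. Something roughly like this, assuming the nvidia-ml-py / pynvml package is installed, would read the utilization straight from the driver instead:)

```python
# Rough sketch: query per-GPU utilization via NVML,
# assuming the nvidia-ml-py (pynvml) package is installed.
import pynvml

pynvml.nvmlInit()
for i in range(pynvml.nvmlDeviceGetCount()):
    handle = pynvml.nvmlDeviceGetHandleByIndex(i)
    name = pynvml.nvmlDeviceGetName(handle)
    if isinstance(name, bytes):  # older pynvml versions return bytes
        name = name.decode()
    util = pynvml.nvmlDeviceGetUtilizationRates(handle)
    print(f"{name}: GPU {util.gpu}% / memory {util.memory}%")
pynvml.nvmlShutdown()
```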

Most of my WUs complete successfully, but I just wanted to make sure nothing is wrong.

Also, on Windows 10 with the NVIDIA Control Panel, I'm curious: would the option under 3D settings to "optimize for compute performance" help with GPU WUs at all? I noticed it a while back after a driver update.
6) Message boards : News : [TWIM Notes] Oct 12 2020 (Message 648)
Posted 13 Oct 2020 by An0ma1y
Post:
Thank you for the update.

Updates like this are the only thing that makes me feel sane anymore; when I can delve into this stuff, I feel at peace, somewhat whole.

It's a pleasant distraction from 2020.

Thank You again! As bozz stated, the transparency is something I also love!

Take Luck and Care!
7) Message boards : News : [TWIM Notes] Sep 14 2020 (Message 496)
Posted 17 Sep 2020 by An0ma1y
Post:
For the record, I don't know where the rumor of a minimum 2xxx series card started, but it's not true.

We'll be bound by the minimum version of CUDA that PyTorch supports. For PyTorch 1.6, this is 9.2, I think... so any card that's supported by CUDA 9.2 should be supported for this. A high-end card shouldn't be necessary, especially since (at the moment) the goal isn't to train state-of-the-art over-parameterized networks on gigabytes of data. A simple 970 or 1050 should be plenty.

I also want to reiterate that, at the moment, the networks in datasets 1 and 2 (and probably 3, though not actually tested) actually train slower on GPUs than on CPUs (maybe with a substantial client rewrite it could be better). Sometimes the networks are so small that the overhead of using/transferring data to/from the GPU dwarfs the speedup gained in the actual matrix calculations.


Thank You for clearing things up!

I think it mostly has to do with the 2xxx and 3xxx series having tensor cores; the rumor was probably an assumption based on that. It's what I was assuming to some degree, too.

But it would definitely be much nicer to be able to use my extra GPUs that aren't 2xxx series for this. I use my 2070 Super for gaming right now, so yeah; not that I wouldn't dedicate some resources from it when it's not in use, of course.
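
The point about tiny networks actually being slower on the GPU makes sense too. Just to illustrate it for myself, here's a rough PyTorch sketch (the layer sizes are made up, not the actual MLDS networks) where the per-step launch overhead and the one-time transfer can outweigh the tiny matrix multiplies:

```python
import time
import torch
import torch.nn as nn

# Tiny made-up network; the real dataset 1/2 networks differ, this is just "small".
def make_net():
    return nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 4))

x = torch.randn(256, 8)
y = torch.randn(256, 4)

def train(device, steps=1000):
    start = time.time()
    model = make_net().to(device)               # moving the model is part of the cost
    data, target = x.to(device), y.to(device)   # so is the host-to-device copy
    opt = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = nn.MSELoss()
    for _ in range(steps):
        opt.zero_grad()
        loss = loss_fn(model(data), target)
        loss.backward()
        opt.step()
    if device == "cuda":
        torch.cuda.synchronize()                # wait for queued kernels before stopping the clock
    return time.time() - start

print("cpu :", train("cpu"))
if torch.cuda.is_available():
    print("cuda:", train("cuda"))
```

For something this small I'd expect the CPU run to win, which matches what was said above.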
8) Message boards : News : [TWIM Notes] Sep 14 2020 (Message 483)
Posted 16 Sep 2020 by An0ma1y
Post:
What could we expect in terms of supported GPUs?

I read someone mentioning a 2080 or higher as a possibility, but I have a 2070 Super.

Thank You for your efforts :>



