If you are going to spend money to build a rig, I would recommend against trying to go portable. When you are training, you will find that your battery is your new constraint.
If you are cash constrained (I was), I put together a $235 box and was able to run all the exercises and all the architectures for lessons 1 and 2 (2018) on it. I got a refurbished Dell i5 desktop for $150 and added a GT 1030 ($85). I bought a KVM for about $120 so I could share my monitor, keyboard, and mouse, but you could probably buy a keyboard, mouse, and monitor for less. You don't really need a high-end monitor to run the lessons.
In another thread I saw someone say that this kind of rig is only for people who are not serious about DL. I beg to differ. I'm a dinosaur and needed to do a lot of catching up. It was an excellent choice. Installing fastai in my own environment was really helpful for later troubleshooting. Having only 2GB of GPU memory forced me to play around with the parameters (batch size was the key) in the lessons.
In the breeds exercise, just before the learn.save('224_pre') step, I was able to get a (slightly) better intermediate result for validation loss and accuracy than the one shown in the lesson:
Lesson shows: validation loss 0.25087, accuracy 0.918786694858872
My result: validation loss 0.24380622941220575, accuracy 0.9212328767123288
In reading all the entries in this wiki, I didn't see anyone say that every time you change your hardware, it's time to update your recovery plan. TAKE A BACKUP! KEEP IT CURRENT! I'm sure @jeremy will endorse this advice. This isn't an issue if you are using Colab or Paperspace or AWS, but for Win10 anyway, it could save you a lot of time.
Despite the gap in the calendar, I’m including the following:
Both my CPU and GPU go from room temperature (about 28 C) to 79-80 C when I'm running a learning step. I know I'm underpowered, but could you be overpowered?
I took a chance and got a card that calls for a somewhat bigger power supply than my box has. So far I haven’t melted the motherboard.
Not sure what the exchange rate is, but I think my rig is about as cheap as can be made and still work through the lessons. Unfortunately it isn't upgradable. Really, the question is this: if you want to learn the fastai stuff, stick with learning the fastai stuff. That way you really know when you have hit a hardware constraint.
I can assure you that having only 2GB can be painful.
I’m working with less, but your approach is upgradable.
It really depends on whether you want to learn fastai or want to build a powerful rig and have the money and time to do so. I'm not an expert on the CPU side, but my rig shows the CPU at 100% busy when the learn epochs start.
I have almost no money. I have a GT 1030 with 2GB. Since I got it squared away, I haven't had to resort to Colab, Paperspace, or AWS. The only drawback I've found is that sometimes I have to go drink a cold one or two, or leave it running overnight, to complete a training run.
Sorry for going on so long. I’m still catching up.