  • RBERT uses TensorFlow via reticulate. If TensorFlow automatically offloads to the GPU, you may have just solved a massive PITA I've been having with it! My old AMD GPU may not be up to snuff. Will need to look into this when I've got time. Thanks! (I ended up doing some analysis using GloVe embeddings recently because BERT was shitting the bed and I ran out of time.)

  • Most need Nvidia cards, afaik.

    I'm not really using any of them in anger just yet, though - just fucking around with Stable Diffusion.

    After I've (finally, lel) finished my dissertation, I'll be training up a LLaMA 2 model (among others). It's for a commercial venture, though, so money will be thrown at it, and it'll be trained on leased and scalable GPUs.

  • Sounds interesting! What's your field? I'm not really a 'data scientist,' just pretend to be one in front of students (or occasional employers). I'm more of a substantive user of methods - applying them to particular domains to - hopefully - say things of scholarly interest. As well as being a critical commentator, I suppose.

    Btw, Google says TensorFlow does default to the CPU if the GPU isn't compatible (so that may not be the bug(s) I'm facing - could be the AMD GPU is incompatible somehow, or just not able to do the stuff I'm asking it to do).

    Anyway, guess I'm buying an Nvidia card. Back to the drawing board.
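    A quick way to tell which case you're in: the stock TensorFlow wheels only ship CUDA support (AMD cards need a separate ROCm build), and when no usable GPU is found TensorFlow silently runs on CPU. A minimal sketch of the check, assuming a Python environment where the `tensorflow` package may or may not be installed:

    ```python
    # Check whether TensorFlow can actually see a GPU. An empty list means
    # TensorFlow has silently fallen back to CPU-only execution - which would
    # explain "it runs, just slowly" symptoms on an unsupported AMD card.
    try:
        import tensorflow as tf
        gpus = tf.config.list_physical_devices("GPU")
        print("GPUs visible to TensorFlow:", gpus)  # [] => CPU fallback
    except ImportError:
        gpus = None
        print("tensorflow is not installed in this environment")
    ```

    The same check works from R through reticulate (e.g. `tf <- reticulate::import("tensorflow")` and then calling the same function), since RBERT is ultimately driving this Python API.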