From the course: Rust LLMOps


Invoke an LLM on an AWS G5 instance, part 2

- [Instructor] So you'd go here, copy it, and clone it. Or, in case I've already done that, I could just say git pull, right? Get whatever latest updates, since they're actively changing this thing, and then I'm ready to go. And how would I actually run things now? Well, you can look at the instructions, if we go down here, and we would want to actually build one of these models. So there are two ways to invoke these large language models. One is to just run it. But I think a better way to invoke it would be to build it, then go to the target directory, and then play around with the model. So let's take a look at this one, for example. BigCode is essentially a competitor to GitHub Copilot, but it's open source. So if we wanted to actually run this, how would we do this? We would just go through here and start a command like this. But we want to tweak this a little bit. So we want to say cargo build, right? Because we actually want to build a…
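The clone, pull, build, and invoke steps the instructor describes can be sketched as a short shell sequence. This is a minimal sketch under assumptions: the transcript never names the repository or the exact binary, so the `huggingface/candle` URL, the `bigcode` example name, and the `--prompt` flag are illustrative guesses based on the BigCode mention, not confirmed details. Running the model also assumes a GPU-backed instance such as the AWS G5 machine from the video title.

```shell
# Clone the repo, or just pull the latest changes if it is already cloned.
# The repo URL is an assumption; the transcript only says "copy it and clone it".
git clone https://github.com/huggingface/candle.git || (cd candle && git pull)
cd candle

# Build instead of `cargo run`: the compiled binary lands under target/,
# so you can invoke the model repeatedly without rebuilding each time.
cargo build --release --example bigcode

# Play around with the model straight from the target directory.
# The example name and --prompt flag are illustrative, not from the transcript.
./target/release/examples/bigcode --prompt "fn fibonacci("
```

Building first, rather than using `cargo run`, matches the instructor's point: once the binary exists in the target directory, each invocation starts immediately and you can vary prompts and flags without paying the compile cost again.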
