Conf42 Machine Learning 2025 - Online

- premiere 5PM GMT

AI-powered Productivity: Boosting Productivity for Software Engineers using Open-Source LLMs


Abstract

AI tools are revolutionizing how solo entrepreneurs manage their businesses. In this talk, I'll showcase practical AI applications that enhance efficiency, from automating workflows to scaling content creation, helping entrepreneurs maximize productivity with minimal resources.


Transcript

This transcript was autogenerated. To make changes, submit a PR.
Hi everyone. My name is Uncle Big B. I'm a software engineer at Hashnode and a technical educator. Today I'm really excited to be talking to you about AI-powered productivity, which involves boosting productivity for software engineers using open-source LLMs. So let's get into it. This is going to be our agenda for today. We are going to begin by understanding why developer productivity matters, then take a look at what if coding wasn't the bottleneck anymore in our workflow, and then dive into AI adoption among developers. We are also going to look at why open-source LLMs, and then what you can actually do with these LLMs. Then we'll explore an architectural overview of how things work in local LLM development, and take a look at some use cases of these LLMs as well. We'll also look at AI tools that go beyond ChatGPT, cover setting up an open-source AI dev tool locally, and look at the performance and cautions of using open-source LLMs on your local machine. We are going to end with best practices for productive AI usage. So let's begin with the first one: why developer productivity matters. Developer productivity isn't just about writing more code faster; it is about building high-quality software with less friction, shipping features faster, and spending more time solving real-world problems rather than getting stuck on repetitive tasks. The first point on why developer productivity matters is faster shipping of high-quality software, because as teams scale and product demands grow, engineers are expected to deliver more, collaborate better, and maintain quality all at once. With this, developers are going to be able to spend less time on boilerplate and repetitive work. And with an effective productivity approach, developers are also going to be able to focus their energy on solving real-world problems.
Rather than spending a lot of time on boilerplate code and repetitive work. And this proves that AI is going to amplify developers who leverage it, rather than replacing us. So when you leverage AI, it's going to amplify your skill and not immediately replace you. With this, I'm going to leave you with this quote, which says: AI is not a threat, it's a productivity amplifier. AI is not coming for your job, but if you leverage AI and position yourself rightly with it, AI is going to amplify your productivity. I'm also going to leave you with this question: what if coding wasn't the bottleneck anymore? Let me ask you something. What slows you down the most in your workflow? Is it the boilerplate code or the repetitive, boring stuff? It is not always the hardest part of the code that slows us down, but the repetitive work, the boilerplate, the setting up of state management with Redux, or things along that line. What if we could move past that friction and focus more on the real business side of things, the things that solve real problems? That is where AI comes in, especially the open-source LLMs, which can do all this repetitive work for us in a very short time. And they're not here to replace us. Like I said, they're here to remove that bottleneck from our workflow. So according to the chart from the 2024 Stack Overflow Developer Survey, AI has been becoming deeply integrated into the way we build software today. The survey shows that, about a year ahead, most developers agreed that AI tools would be more integrated into key parts of their workflow, and that is coming into existence in 2025. More companies and more developers are leveraging AI for documenting, for testing, for generating UIs, and for generating designs as well. Even designers are using it to generate a lot of things. These are not just side tasks; AI is already becoming the backbone of a lot of engineering effort.
This shows that in 2023 and 2024, a lot of people were already adopting it, and in 2025, a lot more people are making use of it. This shows that AI is not a future to come; it is the future we are currently living in. Now let's talk about AI adoption among developers. AI is no longer an experiment like in previous years; it is already becoming mainstream, becoming the norm, just the way we use Git and GitHub. On AI adoption among developers, 76 percent of developers were already using or planning to use AI tools in their development process in 2024. So you can see the numbers: the percentage of developers already using it or planning to use it in 2024, compared to the number of developers using it in 2025. The number is increasing. So the question to ask yourself is: what is holding you back from leveraging these LLMs, whether open source or not? Also based on the survey, 62% were using AI tools in 2024, up from 44% in 2023. Now imagine the numbers in 2025 when the survey is run again. This shows an increasing number of adopters among developers, and you should also get on the train. On the same survey, 81% cite increased productivity as the top benefit of AI tools, which also shows effectiveness among developers. And like I said, these are not just small tasks; they're a critical part of shipping quality software. AI is already writing backends and full front ends of applications, and developers are realizing that using AI isn't optional if you want to stay efficient. In this age, our reality is that if you want to move fast, if you want to get things done faster, you have to leverage AI in some way. Since you already know what you want to do, you can just leverage AI to achieve it faster.
And AI is quietly becoming a default layer in the modern developer workflow, just like Git is, and like Stack Overflow once was in the mainstream of our workflow. Now let's discuss why open-source LLMs. When it comes to using AI in developer workflows, open-source LLMs are a game changer, and here is why. First, you get full control over the models. You are not limited by API quotas, rate limits, or vendor decisions, because they are running on your own local server, and you can also self-host them. Open-source LLMs also provide the flexibility to customize pipelines, which you can run locally, and that is great for speed, cost, and data privacy. You can also fine-tune them for your specific use cases, which is incredibly powerful in engineering teams, because you can customize them for your own use case. And lastly, you can deploy these open-source LLMs anywhere with little or no API cost; you can deploy them on your local hardware without having to pay any cloud charges. Examples of these tools are Ollama and LM Studio, among others, which you can download to your system and then run locally without having to pay any extra charges. So let's talk about what you can actually do with open-source LLMs as a developer. These models are not just for chatbots; we've gone beyond being impressed with chatting with LLMs. They can help across multiple parts of your workflow. They can help with code generation. You can use them for refactoring functions, writing unit tests, and validating logic, which is our reality now; AI can do most of these things. They can talk to databases; they can leverage integrations with Supabase using MCP, and integrate with any other application through its API or MCP, which makes things a lot faster. AI is also already helping with debugging, like we see in the use case of Chrome's console AI, which helps you to
debug network issues and errors right in your console, so you don't have to switch between your development browser and an LLM; it's already integrated into your console. And this is very helpful. Another one is that you can leverage LLMs to learn about an open-source project. You can learn about the GitHub repository, its structure, and the way things are done on the repository, just like for onboarding. So LLMs make things very easy for you when it comes to codebase question and answer as well. Another thing these LLMs are great at is documenting your code. You can leverage the local LLMs, fine-tune them for your own use case, and then use them to document your code. And because you are using them locally or via open infrastructure, you can build your own dev agent, or even plug them into your IDE, like VS Code, Neovim, or the Trae editor. And it's not just about answering questions; it's about automating workflows as well, so you can plug these LLMs into your workflow directly, providing documentation support for your developers and for your consumers as well. So now let's break down how local LLMs actually fit into your development workflow, with an architectural overview of LLMs in local development. Imagine you are coding in VS Code, and instead of relying on a cloud service like OpenAI's API, which has rate limits, you can run your own LLM locally. This is all thanks to Ollama, LM Studio, or vLLM, which you can explore. What you need to do is pair them with extensions like Continue Dev, which is great by the way, and which you can use to integrate these LLMs directly into your IDE, giving you Copilot-like suggestions, but powered by your local model.
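Under the hood, runners like Ollama expose a simple local HTTP API that editor extensions talk to. Here is a minimal sketch of querying it from Python with only the standard library; it assumes Ollama is running on its default port 11434 and that you have already pulled a model (the model name `llama3` is just an example):

```python
import json
import urllib.request

# Ollama's default local endpoint for one-shot generation.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_payload(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    # stream=False asks for a single JSON response instead of a stream.
    return {"model": model, "prompt": prompt, "stream": False}


def ask_local_llm(model: str, prompt: str) -> str:
    """Send a prompt to a locally running Ollama server and return its reply."""
    body = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Calling `ask_local_llm("llama3", "Refactor this function...")` requires a running Ollama server; `build_payload` is a pure helper you can reuse or test on its own.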
So these tools allow you to fine-tune things for your own use case and your own preference, so that they do what you want directly, and that is what they will focus on. This also gives you control over the prompts: you can control the prompts, the models, and the context that are available to these LLMs, only for your own use case, your own data, and your own codebase. The LLM is going to run on your machine, it's going to respect your memory, and it keeps your codebase private as well, based on how you want it, since it's running on your local infrastructure or your own cloud infrastructure. These architectures are usually very fast, they're cost effective, and they give you the flexibility to build intelligent features without leaving your local dev environment. Now let me introduce you to a few tools that bring open-source LLMs right into your workflow. The first one here is Ollama, for local model serving. You can explore this and visit their website; it is a very great tool for running LLMs locally in your local development. Another one that you can pair with Ollama is Continue Dev, which is a VS Code Copilot alternative that allows you to run AI-native LLMs directly from your VS Code. And the next one is vLLM, for optimized serving and scaling of LLMs. So if you want to run and optimize LLMs locally, you can also explore these tools, as we've discussed in the previous slide. Now that we have an idea of the LLMs, the tools we can explore, and what we can achieve by using them, let's take a look at the use cases of AI-boosted developer tasks. These are the things we are probably doing every day. They're not totally different, but this is going to help you see how you can leverage AI or integrate AI into most of these use cases.
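For context, wiring an editor assistant like Continue Dev to a local Ollama model is mostly a small config change. This is a hedged sketch of what Continue's `config.json` can look like; the exact schema varies by version (newer releases use a `config.yaml`), so treat the field names below as illustrative and check Continue's documentation for your setup:

```json
{
  "models": [
    {
      "title": "Local Llama 3 via Ollama",
      "provider": "ollama",
      "model": "llama3"
    }
  ],
  "tabAutocompleteModel": {
    "title": "Local autocomplete",
    "provider": "ollama",
    "model": "llama3"
  }
}
```

With something like this in place, chat and autocomplete requests go to your local Ollama server instead of a cloud API.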
So the first one is code generation, and you can make use of Continue Dev for this. AI can help you in writing repetitive code; think of CRUD endpoints or UI components. It can do them for you effectively. Writing unit tests is also something these models are very good at: with well-written prompt templates, they can help you write unit tests in the particular language you want. They are also very good at generating PR summaries and commit messages, which are auto-generated and much more organized, and you can also learn from these as an engineer. They also help you generate architectural diagrams for other developers; if it is an open-source project, they can help you generate the architecture for your README and your documentation. All of these things are not theoretical; they are real-life use cases that teams are already using, in real-life projects. And you can build these workflows using tools we've already mentioned, from Continue Dev to LangChain with locally hosted models. Like we've been saying, AI is already becoming our sidekick, and it's part of our toolkit today. So now let's talk about open-source AI tools that you can use if you are excited about integrating AI into your workflow. The good news is you don't need an expensive subscription or closed APIs. There's a growing ecosystem of open-source tools built for developers like you and me, and they are designed to run locally, can be customized, and let you keep control of how things work. So here are some of the best open-source AI tools that you can start using today, and you can mix them together, like we've been saying. The first one is Continue Dev, which you can pair with your editor, such as VS Code, Trae, or any other editor that supports LLMs.
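Several of the use cases above, like unit tests and commit messages, come down to reusable prompt templates. As a tiny illustrative sketch (the helper name and the template wording are my own, not from any particular tool):

```python
def unit_test_prompt(language: str, framework: str, code: str) -> str:
    """Build a reusable prompt for asking an LLM to generate unit tests.

    Adapt the wording to your own stack; the point is that the template
    is versioned code you can reuse, not a one-off chat message.
    """
    return (
        f"You are a senior {language} engineer. Write {framework} unit tests "
        f"for the code below. Cover edge cases, use descriptive test names, "
        f"and return only code.\n\n"
        f"```{language.lower()}\n{code}\n```"
    )
```

You would then pass the returned string to whichever local model you run, and review the generated tests before committing them.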
Another one that is more popular is Cursor, which is a VS Code fork built entirely around AI, so you can also leverage that one. You can also leverage Tabby ML, which is a lightweight, open-source autocomplete tool for developers. And you can take a look at LM Studio, which also runs local models and makes your work very smooth. The last one I have here is OpenDevin, which is very good for building autonomous dev agents. So if you want to build your own custom dev agents that do specific things for you, the way you want, with your own prompting, your own rules, and your own data, then you can also leverage this. This is a table that lists the tools and compares them based on their use cases; if you don't know what tool to use for what, here is a summary. You can check out Ollama if you want to manage your local models. You can check out LM Studio, which is a UI-driven local LLM runner, great for frontend engineers. Then you can check out Continue Dev, which is a VS Code AI assistant; if you're already using VS Code, you can plug it in as an extension. Then we have Tabby ML, a code-completion alternative that you can also leverage, and vLLM, which serves LLMs efficiently; if you want to run your LLM locally, you can also leverage the vLLM implementation. And we have LangChain as well, which is used to build intelligent apps. So that's all of these tools. Now let's take a brief look at setting up an open-source AI dev workflow. So far we've talked about why AI matters and what you can do with it; now let's take a deeper look at how all of this actually works in an open-source AI workflow for developers. And the beauty of it is that you don't need a cloud subscription to get started, like I've been saying. So first, you are going to choose a runner.
It can be Ollama or LM Studio; they're both great, so you can pick one, and then integrate it with an IDE using tools like Continue Dev or Tabby ML. From there, you can configure it to match your local model endpoints, and you can also fine-tune it for your own use case. Then you can have prompt templates for your stack, whether you're making use of Next.js, Vue, or any other stack, or just plain HTML and CSS. Depending on what you want to use it for, maybe for backend or frontend, you can craft these effectively. All of this runs on your machine, so you don't really have to host it, and it makes even more sense if you are a solo developer working on a project, because you don't have to worry about downtime, slowness, or limits. You own the environment and can customize it to suit your workflow with prompt templates for any of your tasks. So now let's take a look at performance and cautions, because while open-source LLMs offer a lot of power and flexibility, it's also important to understand both the performance you can expect and the cautions you need to keep in mind. On the performance side, the majority of these models are surprisingly fast and accurate already, especially when you run them locally using optimized tools like vLLM, as we mentioned in a previous slide. For most developer productivity tasks (writing code, writing tests, summarizing, or generating docs) they are more than enough, and they are great tools for the everyday dev workflow. And there's no vendor lock-in: you own these models, you can use them whenever you want, and they can live on your machine locally. Now, some of the cautions you should take note of. AI can hallucinate, which means it can generate wrong or misleading outputs with confidence.
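Because the output can be confidently wrong, one cheap first-pass guardrail is to check that generated code at least parses before you even review it. Here is a sketch for Python using the standard library `ast` module; the helper name is my own, and note that this is only a syntax gate, not a correctness check:

```python
import ast


def parses_ok(generated_code: str) -> bool:
    """Cheap first-pass check on LLM-generated Python: does it even parse?

    This only catches syntax errors. It says nothing about correctness,
    so you still review the code and run real tests before trusting it.
    """
    try:
        ast.parse(generated_code)
        return True
    except SyntaxError:
        return False
```

A `False` here means reject the suggestion immediately; a `True` still means read it and run your actual test suite.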
So it might be blind to its own mistakes, which is why you must validate its suggestions and always review them, especially when you are working on sensitive logic. And you should remember that even when you are using local tools or local LLMs, you should be conscious and mindful of data privacy. Be mindful of sharing sensitive data and inputs like production API keys; data privacy is basically your own responsibility. So don't just paste API keys or internal codebases in without understanding the risk. Take some time to learn how these LLMs work, what their codebase looks like, and what the community is saying about them; this will help you better understand how things work as well. So now that we've gone through the performance and cautions of these LLMs, let's take a look at best practices for productive AI usage. Before you jump in and wire LLMs into everything in your codebase, because it can definitely be exciting, here are a few best practices to get the most out of AI, especially as a software engineer who wants to run things locally. First, I would recommend that you start with real bottlenecks in your workflow. Don't use AI just because it is cool. Identify the places, the gaps, that AI can fill in your workflow. Use it where you are wasting the most time: maybe every time you set up a new project you need some boilerplate, or to develop a feature or UI component you need to do some repetitive work. See if AI can come in there, or whether it is something you can optimize yourself even without using AI; the point is to identify where to use AI and where not to. Another best practice is to prompt thoughtfully: write clear, intentional prompts. You are pairing with a machine, so you have to be specific. AI shines when it has the right data, so if it doesn't get enough context through your prompt, it might not give you the effective
results or output that you expected. You get better results when you give better context: good context, samples, and clarity. Treating your prompting as a skill is going to help you make use of these LLMs effectively. Also, validate everything; this is very important. Always review the output. AI might be confident and still be wrong, so you want to ensure that you are still in the process and not blindly accepting the changes. Always read and test before trusting the changes. And don't share sensitive code; be smart about privacy. We discussed it earlier in the cautions, so you want to ensure that you're not sharing sensitive data with the LLMs, because it can be exploited at some point. You have to be careful with the data you share with them. The next one is to pick tools that match your stack, and you have to think long term, because the best developers are not just using tools, they're building systems around them. You want to build a system that works for your stack for a very long time, whether as a Python developer or as a JavaScript developer, so that your usage of AI is not just for the short term but for long-term use as well. With that, to wrap this up, I will leave you with this quote, which says that AI is your turbocharger, not your brain. This means AI is not just a cool tool anymore; it's fast becoming essential, a foundational part of how engineers work smarter, ship faster, and write more reliable code. We already have augmented engineers. So if you haven't already started using AI to support even one of these areas in your workflow, I think now is a perfect time to experiment with these open-source LLMs. And the more we understand how AI fits into our workflow, the more strategic we can be about boosting our efficiency, not just with tools, but with intention.
So try to spend some time understanding where these LLMs can fit in, especially by running them yourself and tuning them for your own use case, and then let AI help you move from idea to output faster. Now my challenge to you is to try just one open-source LLM tool this week and see how it changes your workflow. Leverage it, think about what it can solve for you, and then see how you can integrate it into your workflow. Okay? And that's it from me. Thank you so much for your time and attention, and for staying to the end of this talk. I hope this session sparked a few ideas and gave you something practical to take back into your workflow. I'm happy to chat more after this; feel free to say hi or connect with me on LinkedIn or on X. You can check out my YouTube or my blog and leave me a note as well. Until then, stay curious and keep building smarter. Bye-bye for now.

Ayodele Samuel Adebayo

Software Engineer @ Hashnode



