Conf42 Chaos Engineering 2025 - Online

- premiere 5PM GMT

Revolutionizing Nonprofit Marketing Analytics with a Unified API Ecosystem for Real-Time Insights


Abstract

Discover how a cutting-edge API ecosystem revolutionized nonprofit marketing analytics! Learn to integrate data from top media platforms, automate workflows, and deliver real-time insights. Empower your organization with actionable strategies to boost efficiency and decision-making in a digital world!

Summary

Transcript

This transcript was autogenerated. To make changes, submit a PR.
Hi, everyone. Good morning, good evening. I hope you are all doing well. This is Rao. Welcome back to another talk. Today we are talking about how we can build an API ecosystem for our organizations — this will definitely help us a lot. Welcome to this presentation on integrating AI-driven analytics and an API ecosystem in nonprofit organizations. Let's explore how this framework combines advanced analytics with multi-platform data synchronization. We're going to talk about how to bring data from different social media platforms into your existing database. In today's world, there are a number of third-party vendors who provide a service to pull data from all the social channels like Facebook, LinkedIn, TikTok, YouTube, and the other platforms, because they have already built the scripts and programs. But eventually it's our data — they use our credentials — and they charge yearly subscriptions to accommodate all the data needs. So today I want to explain different ways to avoid those third-party tools, and how you can pull your own platform data into your databases using this approach. Yes, there is some scripting involved, and there are technologies we need for this process, which I'm going to discuss today. But it is essentially maintenance-free, and we don't have to pay so many dollars on a yearly basis. If it were a one-time cost, maybe we could consider it, but otherwise we end up paying every year for the same subscription. So the solution we're talking about today will help you bring your data into your own database using these approaches. Assume that you have different data flows.
For instance, the data you're looking for comes from Facebook, Instagram, LinkedIn, Twitter, TikTok, Meta — all the social platforms, whether paid media or organic social media. What are we going to do? First, we need the login details for the platform we're going to access. Then we build a script, in either C# or Node.js, that calls the API, and there are two options we can use to schedule that script. It will pull the data, convert it into a CSV, and place it in a secure location — the target location we define in the script. So, to lay out the scenario: first, yes, we need the login details for the platform we're going to access. Second, we prepare the script, in either C# or Node.js, with the API details, the endpoints, and so on, which we'll talk about in a minute. The script will then automatically download the data from the social platforms. And it's not only social platforms — this solution will work for any platform that is API-supported. Then, for scheduling, you can go with the Windows Task Scheduler, or you can go with AWS Lambda. Nowadays many people are using AWS services, and Lambda is relatively cheap for this kind of scheduling. But if you don't have AWS services, that's okay — you can still use the Windows Task Scheduler to trigger your jobs, and I will also talk about how to set up the scheduler. Then the scheduler runs.
The scheduler picks up the script, the script talks to the source, pulls the data, and writes it to the CSV location. From the CSV into the database you can use AWS Lambda, or another load mechanism. From there, your BI analytics come into the picture. Now, how does this work in a little more detail? Every API has its own endpoints, and based on the KPIs we are looking for, we need to go through the API documentation and make sure they are supported. If yes, what are the pros and cons, and how do we get the data? All these applications have really good API documentation; we just need to go through it and get the data. For instance, take Sprout Social, or Meta Ads. They expose their APIs and provide the endpoints, the metrics, and some other parameters. Say you're looking for a customer ID or a topic ID — you need to go through the documentation, get those details, and pass that information into your script. Before implementing the script, what I normally do is use Postman to check how the endpoint responds, how the data comes back, and whether my numbers match the source — that's a tip I can give you here. Similarly, Meta Ads also has really good API documentation, with some other parameters — the Meta app ID, the insights fields, the insights limit, and so on.
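As a hedged illustration of passing those documented parameters (account or customer ID, fields, limit) into the script, the snippet below just assembles a request URL. The base URL and parameter names are made up for illustration — they must be taken from the platform's actual API reference, and the Postman check above should be done against the real URL first.

```javascript
// Sketch of building an insights request URL from parameters listed in
// a platform's docs. Base URL, account ID format, and parameter names
// here are placeholders, not copied from any real API.
function buildInsightsUrl(baseUrl, accountId, params) {
  // URLSearchParams handles the percent-encoding of values for us.
  const query = new URLSearchParams(params).toString();
  return `${baseUrl}/${accountId}/insights?${query}`;
}

// Example shape, with placeholder values:
const url = buildInsightsUrl(
  "https://graph.example.com/v1",
  "act_12345",
  { fields: "impressions,clicks,spend", limit: "100" }
);
// Paste the resulting URL into Postman to confirm the response shape
// and that the numbers match the source before wiring it into the job.
```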
Similarly, Google Ads has its own API documentation and endpoints, and so does Bing Ads. Bing Ads, I would say, is a little tricky, but not that complicated, and they have really good documentation. There are other parameters too: for Google Ads, for example, the client ID, the client secret, and the token endpoint. These are the parameters we need to consider. Now, you may want to bring the entire universe of data from your social channel, or only specific metrics, into your reporting platform — that's where we need to have the final requirements. If you want to bring the entire universe from the source systems, you can absolutely consider that, and you prepare the script based on the KPIs you are looking for. There are some key points to consider. How many KPIs are we bringing, and is each KPI we're looking for supported by the API or not? That is the first thing we need to check. There are some KPIs these platforms may not support; in that case, we need to raise a support ticket with the source platform's team, and they will provide those details. But as long as the KPI is supported by the API, and you have the credentials and the API endpoints, you can go ahead and build it. Another thing we need to consider is pagination — we need to make sure pagination is set up correctly. The other important point: based on my observation, most of the social platforms refresh around or after 9:00 AM EST. So if you run your scheduler before 9:00 AM EST, you may not see up-to-date data. Again, this is based on the documentation and my own experience.
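The pagination point above can be sketched as a small cursor loop. The `{ data, next }` page shape here is an assumption for illustration — real platforms variously use cursors, page tokens, offsets, or full next-page URLs, so check each API's documentation and adapt the injected `fetchPage` function accordingly.

```javascript
// Hedged sketch of cursor-based pagination: keep requesting pages until
// the API stops returning a "next" cursor. fetchPage is injected so the
// same loop can wrap any platform's page-fetching logic.
async function fetchAllPages(fetchPage) {
  const rows = [];
  let cursor = null;
  do {
    // First call passes null (no cursor); later calls pass the cursor
    // returned by the previous page.
    const page = await fetchPage(cursor);
    rows.push(...page.data);
    cursor = page.next ?? null;
  } while (cursor);
  return rows;
}
```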
If you have different needs around scheduling and how fresh the data should be, I recommend you go through the documentation and set your scheduling accordingly. The same approach is applicable to all the other platforms, be it TikTok, LinkedIn, or Twitter. The only extra thing on the Twitter side is sentiment: if you have a requirement to bring sentiment from the X platform, you need to make sure the X platform API actually supports sentiment. These are the findings I'd like to share from connecting with the APIs. So once we build the script, what do we need to do? We need to set up the Windows Task Scheduler to pull the data on a scheduled basis, because you don't want to do this manually every day — that's where the scheduler helps us. It is a built-in Windows tool, many people use it, and it's pretty straightforward to set up. The only thing is that you should have admin privileges to use the scheduler's features. Open the Task Scheduler from your Start menu, create a task, and point it at the script — the API job we developed to pull the data. For instance, take Facebook: prepare that script using the mechanism we talked about before, and place it on your C: or D: drive, wherever you want that program to run from. Then, on the same machine, open the Task Scheduler, create the task, and give it a name. You need to set a few things: browse to the script file we created (the Facebook one, for instance), set the refresh timing, and give it the target.
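As a rough sketch of the Task Scheduler setup described above, the same daily task can also be created from an elevated command prompt with the built-in `schtasks` utility. Every value below (task name, script path, time) is a placeholder; 09:30 is chosen only to land after the ~9:00 AM EST platform refresh mentioned earlier, assuming the machine is on Eastern time.

```shell
rem Illustrative only - run from an elevated prompt (admin rights needed).
schtasks /Create /TN "FacebookMetricsPull" ^
  /TR "node D:\jobs\facebook-pull.js" ^
  /SC DAILY /ST 09:30
```

The GUI route (Create Task, set the trigger time, browse to the script) produces the same result; use whichever your team prefers.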
Let's say you want to load it into Oracle or Amazon Redshift. Create the same structure in the target table as you have in your file, so that when the scheduler refreshes and loads the data, it lands in the same target table — so you need to make sure you define the target table as well. The only thing to consider here is, when you're creating the target table, make sure the data types and the column order are the same as in your script. Then, when you have the scheduler set up, make sure to run two or three test runs with a small amount of data, and that's it. Once the scheduler runs and creates the aggregated data file, it will be loaded into the table. Let me just repeat the steps one more time. First, we need valid user login details. Second, we need to decide which KPIs we are targeting to bring into the database. Third, we check whether those KPIs are supported by the API or not. Once we pass all these checks, the next step is to try it in Postman, so that we can see whether everything is working fine or whether there are any hiccups. Then create the package — what we call the API package — and bring that package onto your local system. The next step is to create the scheduler, and before you enable it, make sure you create the target table. In the scheduler, all we need to do is use that package, define the target, and define when the scheduler has to run. Once you set all this up, run it, and your data will come and sit in your database. It's pretty simple, and this will work for all other applications too. That way, we don't have to opt for third-party tools and pay so many dollars on a yearly basis — and it's our own data.
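To make the "matching target table" point above concrete, here is a hedged sketch for a Redshift target. All column names, types, the bucket path, and the IAM role ARN are placeholders — the columns must mirror your CSV's headers, types, and order exactly.

```sql
-- Illustrative target table for the Facebook CSV; names, types, and
-- column order are placeholders that must match the script's output.
CREATE TABLE social_metrics_facebook (
    report_date  DATE,
    campaign_id  VARCHAR(64),
    impressions  BIGINT,
    clicks       BIGINT,
    spend        DECIMAL(12, 2)
);

-- Redshift-style load from the staged CSV (paths/ARNs are placeholders).
COPY social_metrics_facebook
FROM 's3://your-bucket/exports/facebook.csv'
IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-load'
CSV IGNOREHEADER 1;
```

For Oracle or other targets the DDL differs, but the rule is the same: table structure first, test runs with a small file, then enable the schedule.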
And yes, the technologies we need: one, either C# or Node.js; then the Windows Task Scheduler, which I think is pretty straightforward; and a little bit of SQL knowledge. Another key point is that you need admin privileges to enable the scheduler and to create the table in the database. If you have all these things, you are good to start. Now let's talk about the impact of digital transformation in nonprofits. Leading organizations that have successfully adopted AI and cloud-based solutions for donor management: 78.2%, which is a really good number. Efficiency improvement — greater productivity through automated workflows and digital tools — is around 56.2%, and overhead reduction is close to 43.8%. Engagement boost is 67.4%. API integration is at almost 62.7%, and cloud infrastructure at 45.5%. AI-driven analytics revolutionize donor relationships: donor retention, annual donation predictions, and prediction accuracy from machine learning models — all of this feeds AI-driven analytics for a donor management system. A little bit on the architecture level: this is high-volume data processing. Based on the data we were using, our robust infrastructure handles almost 4.7 TB of donor data monthly, enabling real-time decision making across all the platforms. Enterprise media intelligence is industry-leading, at close to 99.9% while processing 1.2 million daily media mentions, ensuring comprehensive brand monitoring. And there is comprehensive social analytics — advanced processing of 2.3 million cross-platform social interactions, delivering actionable engagement insights.
Performance marketing analytics, multi-cloud redundancy, an advanced execution framework — all of this makes up our system architecture overview. I'd like to spend a little time on AWS Lambda infrastructure performance, because I see Lambda performance as really good too: enterprise-grade reliability is 99.95%, lightning-fast response is 95%, scalable performance is really good, and I see really good numbers on dramatic cost savings. The serverless architecture leveraging AWS Lambda's event-driven compute model dynamically allocates up to 10 GB of memory and executes functions with almost 99.9% availability across multiple global regions, with scalable performance and multi-region resilience. All of that is what I consider AWS Lambda infrastructure performance. Then let's talk about the AI-powered donor analytics platform. It gives a 360-degree donor view, processing, as I mentioned, 4.2 TB of donor information daily, equivalent to 2.1 million donor profiles, which is pretty high. The predictive analytics engine analyzes 3.2 million donor behavior patterns daily with 94% accuracy — and 94% accuracy is a really good percentage. The advanced research infrastructure executes 234 complex queries daily with a lightning-fast 98 ms response time, and ensemble neural networks reach 96.5% predictive accuracy across diverse donor segments, with real-time data processing. This enterprise-grade platform harnesses cutting-edge AI and distributed computing technology to transform nonprofit operations by processing over 8.7 million donor interactions monthly across 2,345 organizations. Now, looking at the 360-degree donor view system: this is really impactful for any organization. When you have a 360-degree donor view, you know the donor exactly — where they are, how healthy the relationship is, how much they're donating, and where else they're donating.
All of that information — plus instant social intelligence, precision ad optimization, and data-driven reliability — helps you know your donor better. And the more we know about the donor, the more funds we can raise from them. Then there are the predictive analytics engines: advanced data processing, intelligent ad integrations, dynamic donor profiling, ensemble machine learning architecture, and real-time data processing. If we consider all of these together, this really helps us achieve good productivity — the most effective analytics engine established in our organizations. The advanced search infrastructure has also really helped us a lot to understand more: lightning-fast processing, human-level understanding, robust ad intelligence, and holistic scalability. For the digital analytics and API ecosystem, there are about seven core components we can talk about: core platform components, implementation architecture, operational considerations, RESTful API integration, a scalability framework, a compliance and security layer, and performance telemetry. In conclusion, the transformative potential: our revolutionary digital analytics framework has already demonstrated extraordinary impact, with participating nonprofits seeing an average 47% increase in donor engagement and a 42% improvement in retention rate. By seamlessly integrating AI-driven analytics across multiple platforms and processing over 8.7 million monthly donor interactions, we are empowering organizations to transform data into meaningful relationships and measurable results. So, with that: establishing this API ecosystem to bring data from multiple social platforms into your own database, without relying on third-party tools, really helps.
One, you can do a lot of cost saving; two, it's your data — you own it and you understand it; and three, you can do really good analytics on top of it. Thank you so much for your time. I hope you found something useful for your organization today. Thank you so much.
...

Rao Marapatla

Director, Data Infrastructure and Reporting @ BreakthroughT1D



