[No-code] ML using Amazon SageMaker Canvas
Data-Centric AI in GPT Models, OpenAI API in Code & argparse in ML workflows | Mar 31
👋 Hey,

"XAI is a critical area of research for building safe and trustworthy AI systems. Without it, we risk creating black box systems that we can't understand or control."
- Timnit Gebru, Former Co-Lead of the Ethical AI Team at Google

Explainable Artificial Intelligence (XAI) is crucial in the era of GPT because it provides transparency and accountability for AI systems. As GPT models grow larger and more complex, XAI techniques can help ensure that their decision-making processes are understandable and fair, reducing the risk of bias and errors.

This week we focus on resources that improve efficiency in data and ML practices by eliminating time-consuming tasks. The goal is to streamline processes and make data management more efficient.

Key Highlights:
- Data-Centric AI Concepts behind GPT Models
- Open Source ChatGPT Models: A Step-by-step Guide
- Calling the OpenAI API in Code

If you're interested in sharing ideas to foster the growth of this professional data community, this survey is for you. Share your thoughts and get a FREE bestselling Packt book, The Applied Artificial Intelligence Workshop, as a PDF. Jump on in!

TELL US WHAT YOU THINK

Cheers,
Merlyn Shelley
Associate Editor in Chief, Packt

Latest Forks on GitHub

GPT
- KanHarI: Summarizes the changes introduced by a pull request in a repository.
- acheong08: Reverse-engineered API of Microsoft's Bing Chat AI.
- arc53: GPT-powered chat for documentation search & assistance.

XAI
- ModelOriented: Enables you to explore and explain the behaviour of any model.
- EthicalML: A machine learning library designed with AI explainability at its core.
- salesforce: A one-stop explainable AI library for data scientists and researchers.

Industry Insights

AWS ML
- [No-code] machine learning using Amazon SageMaker Canvas: Canvas simplifies applying ML to business problems for analysts who don't need to know the technical details. It offers preview analysis for estimated accuracy and feature influence, and two methods for training ML models, Quick Build and Standard Build, with complete transparency. With Canvas, users can achieve business outcomes with ML without writing code.
- [AWS AI/ML services] to enhance the use of evaluation to support progress toward the Sustainable Development Goals: The United Nations Development Programme (UNDP), in collaboration with AWS, developed a machine learning platform called Artificial Intelligence for Development Analytics (AIDA) to help evaluators search for relevant content accurately and efficiently. By using Amazon Textract, paragraph-level evidence extraction accuracy improved from under 60% to over 80%, and multi-label classification accuracy increased from under 30% to 90% after retraining models in Amazon Comprehend with improved training datasets.

Google Cloud
- [Run AlloyDB] anywhere - in your data center, your laptop, or in any cloud: AlloyDB Omni, a technology preview, is a downloadable edition of AlloyDB that can run on-premises, at the edge, across clouds, or on developer laptops.
It offers high performance, PostgreSQL compatibility, and Google Cloud support, making it a cost-effective alternative to expensive legacy databases for enterprises.
- [Gen App Builder] Build new generative AI powered search & conversational experiences: Google Cloud's Gen App Builder enables developers to leverage Google's generative AI capabilities and create enterprise-grade AI applications without requiring extensive machine learning skills. The platform offers an orchestration layer that simplifies integrating generative AI tools with enterprise systems for a seamless user experience.

Just for Laughs!

What do you call a conversation between GPT and XAI? A debate between the generator and the explainer!

Understanding Core Concepts

Calling the OpenAI API in Code - By Steve Tingiris

When you're working with new technologies, good documentation is often not available. Fortunately, that's not the case with GPT-3. The OpenAI documentation is extremely well done: it's complete, easy to follow, and provides a number of very useful examples. Here, you'll learn how to start using the OpenAI API in code. For this exercise, you will need access to the OpenAI API, which you can request on the OpenAI website. The OpenAI API is a standards-based API that can be used with almost any modern programming language. We will start by creating a repl for Python.

Creating a repl for Python

From the home screen, click on the + New Repl button, then:
- Choose Python from the repl type dropdown.
- Name the repl exploring-gpt3-python.
- Click the Create repl button.

Setting your OpenAI API key as an environment variable

In replit, you can save environment variables by clicking on the padlock icon (Secrets) in the navigation pane and adding a name and value pair. You'll need to do this for each repl you're working with - so, one in your exploring-gpt3-node repl and/or one in your exploring-gpt3-python repl.
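Before moving on, it can help to confirm that a secret is actually visible to your code. Secrets in replit are exposed to your program as ordinary environment variables, so a quick check with Python's standard os module looks like this (a minimal sketch; it only reports whether the variable is set and never prints the key itself):

```python
import os

# Secrets added in replit appear to your program as environment variables.
api_key = os.environ.get("OPENAI_API_KEY")

if api_key:
    print("OPENAI_API_KEY is set")
else:
    print("OPENAI_API_KEY is missing - add it under Secrets")
```

Run this before making any API calls; if the key is missing, requests to the API will fail with an authentication error.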
To add your OpenAI API key as an environment variable, do the following:
- Open your repl.
- Click the padlock icon (Secrets) in the middle of the navigation pane.
- In the Key input textbox, add OPENAI_API_KEY.
- In the Value input text area, paste in your OpenAI API key.
- Click the Add new secret button.

Understanding and creating the .replit file

In replit, by default, clicking the Run button runs the main.py file for Python repls. However, you can create a file named .replit to change which code file the Run button executes. As we work through different examples, we'll be creating and testing code in multiple files, so we're going to need a .replit file. To create a .replit file, do the following:
- Open your repl.
- Click the add file icon on the top-right side of the navigation pane.
- Name the file .replit.
- Add the following text to the first line of the .replit file (for Python repls):

```
run = "python main.py"
```

Using the OpenAI API in Python

We'll start by setting up a new folder for the Python code; then we'll add a file and update the .replit file so the replit Run button executes our new file. Here are the steps to follow:
- Open your exploring-gpt3-python repl.
- Click the Add folder icon in the top right of the navigation pane.
- Edit the .replit file so the Run button executes engines.py using the following text: run = "python engines.py"
- Add the following code to engines.py:

```python
import requests
import os

# Read the API key from the OPENAI_API_KEY environment variable (the
# Secret you added earlier).
apiKey = os.environ.get("OPENAI_API_KEY")
headers = {'Authorization': 'Bearer ' + apiKey}
# The endpoint URL was missing from the original text; the engines list
# lives at https://api.openai.com/v1/engines
result = requests.get('https://api.openai.com/v1/engines', headers=headers)
print(result.json())
```

This curated content was taken from the book Working with the OpenAI API | Exploring GPT-3 (packtpub.com). To learn more, click on the button below.

SIT BACK, RELAX & START READING!

Find Out What's New?
- [Open Source] ChatGPT Models: A Step-by-step Guide: This article covers the technical aspects and steps involved in using two open-source models, Alpaca and GPT4All.
It aims to help data practitioners effectively use these models for their desired applications. Both models are available for free and do not require the paid OpenAI API.
- [Argument Parsing] for Greater Efficiency in Machine Learning Workflows: Here you will learn how to use argparse for command-line applications and machine learning projects. Argparse is a Python library for parsing command-line arguments, which provides flexibility for configuring and customizing program behaviour. This article covers its core functionalities with examples, demonstrating how to use argparse efficiently in Python applications.
- [Distributed] Hyperparameter Tuning in Vertex AI Pipeline: This article showcases how to integrate distributed hyperparameter tuning into the GCP Vertex AI pipeline, which provides an easy way to create machine learning workflows from data collection to endpoint monitoring. The article fills the gap in the existing Vertex AI pipeline by demonstrating how to integrate the GCP HPT module.
- [Data-Centric] AI Concepts behind GPT Models: Large Language Models (LLMs) like ChatGPT, GPT-3, and GPT-4 have made significant progress in tasks like language translation, text summarization, and question-answering. These AI techniques can also perform tedious data science tasks more efficiently, such as data processing, cleaning, and even generating data for training. You can find relevant data-centric AI resources on their regularly updated GitHub repo.

See you next time!

As a GDPR-compliant company, we want you to know why you're getting this email. The _datapro team, as part of Packt Publishing, believes that you have a legitimate interest in our newsletter and its products. Our research shows that you opted in for email communication with Packt Publishing in the past, and we think your previous interest warrants our appropriate communication.
If you do not feel that you should have received this or are no longer interested in _datapro, you can opt out of our emails by clicking the link below.
© 2023 Packt Publishing, All rights reserved.
Our mailing address is: Packt Publishing,
Livery Place, 35 Livery Street, Birmingham, West Midlands B3 2PB
United Kingdom
[Unsubscribe]()