A selection of the top articles and videos from the last week on SystemsDigest.com. Don't forget to check back regularly for daily updates from around the globe.
If you like SystemsBuzz, why not forward it to some friends or share the online version?
Getting data *into* your database is easy, but querying large datasets is challenging—especially without the right indexes. Pavel Tkachenko teaches how to write performant SQL queries with EXPLAIN and ANALYZE.
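As a hedged sketch of the idea (the article targets Postgres's EXPLAIN and ANALYZE; the snippet below uses SQLite's analogous EXPLAIN QUERY PLAN so it runs with the Python standard library, and the table and index names are invented for illustration), you can ask the planner whether a query will hit an index:

```python
import sqlite3

# Build a tiny schema with an index on the column we filter by.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")
conn.execute("CREATE INDEX idx_users_email ON users (email)")

# Ask the planner how it would execute the query, without running it.
rows = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM users WHERE email = ?",
    ("pavel@example.com",),
).fetchall()
plan = " ".join(str(r[-1]) for r in rows)
print(plan)  # a SEARCH using idx_users_email rather than a full table scan
```

Postgres's EXPLAIN ANALYZE goes a step further: it actually executes the query and reports real row counts and timings alongside the plan.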
As developers, one of the most important capabilities we can design into our applications is the ability to know whether they are running in an ideal operating condition, or, put more simply, whether they are healthy. This is particularly important when deploying to Kubernetes. Kubernetes has the concept of container probes that, when used, can help ensure the health and availability of your application. In this post, we'll explain what container probes are, when and why you should use them, and why you should consider avoiding TCP probes to improve the quality of your application.
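For concreteness, here is a hedged sketch of how probes appear in a Pod spec (the container name, image, port, and path are made up for illustration, not taken from the article):

```yaml
# Fragment of a Kubernetes Pod spec; names, port, and path are illustrative.
containers:
  - name: web
    image: example/web:1.0
    # An HTTP probe verifies the app can actually serve a request.
    readinessProbe:
      httpGet:
        path: /healthz
        port: 8080
      initialDelaySeconds: 5
      periodSeconds: 10
    # A TCP probe only verifies the port accepts a connection, so it can
    # report a process as healthy even when it cannot do useful work:
    # the article's reason to be wary of tcpSocket probes.
    livenessProbe:
      tcpSocket:
        port: 8080
      periodSeconds: 15
```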
NoCredentialsError is raised by the Boto3 library when it interfaces with Amazon Web Services (AWS) and your AWS credentials are missing, invalid, or cannot be located by your Python script. By default, Boto3 looks for these credentials in ~/.aws/credentials, which holds the access key and secret access key for using AWS services; other configuration details, such as your region code, live in the companion ~/.aws/config file. Think of these files as your login and password for the service.
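For reference, the credentials file typically looks like the following sketch (the key values below are placeholders in the style of AWS's documentation examples, not real credentials):

```ini
# ~/.aws/credentials (placeholder values)
[default]
aws_access_key_id = AKIAIOSFODNN7EXAMPLE
aws_secret_access_key = wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
```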
With Ashish Khandelwal, Mainframe Modernization Engineer at Microsoft, Mukesh Kumar, Principal Group Engineering Architecture Manager at Microsoft, and Tom Griggs, Global Partner Senior Manager at Qlik
The sequel to "How to measure the success of data teams," in which Montreal Analytics explains how to evaluate the performance of individual contributors in a data team.
Python is insanely popular among machine learning enthusiasts these days, so anyone developing a machine learning model normally turns to Python. The real challenge arises at the deployment stage, where there are many frameworks to choose from, and figuring out which one to use can add to the confusion. This post discusses two popular Python web frameworks for serving machine learning models, Flask and Django, and compares them side by side so that you can make the right choice. If you are stuck in the deployment stage, hop in, because this post is for you.
Developers frequently choose Ably for building chat applications or to provide chat functionality in their products. While dev teams in different companies have different priorities, overall our customers tell us that using our serverless WebSockets platform frees them to fully focus on delivering the best possible core user experience. Just like our customers, we too like to keep our users happy. So we’ve set out to make it easier and faster for devs to add additional functionality in the 1:1, group or large-scale chat apps they build on Ably.
Stuck manually updating data for that Excel report? There is a better way to spend your time at the office. Automated data processing can do the heavy lifting for you and liberate your schedule for more productive and creative work.
It’s always great to build something that makes money. The most successful businesses find efficient ways to generate revenue while keeping costs and support burden to a minimum, and many companies now look to monetizing their APIs as part of that strategy. API monetization isn’t always easy, though. It generally takes a lot of integrations, a fair amount of code and customization, and it can lead to a large support burden, especially when billing issues arise. In short, there are challenges both during implementation and once the billing system is up and running.
With cybercrime continuing to grow at an alarming rate and cybercriminals getting increasingly clever about how they get their hands on your precious data, API authentication is more important than ever. If you’ve ever logged into an app or website using your Facebook or Google account, then you’ve used API authentication. APIs are the backbone of the internet. They allow disparate systems and login pages to communicate, exchanging user data and triggering actions. But with great power comes great responsibility, and APIs must be properly secured to prevent misuse.
Machine learning is used across industries and user communities for a wide variety of predictive analytics needs – use cases ranging from sales forecasting to churn reduction, customer lifetime value, inventory optimization, capital allocation and more.
According to Cynerio’s 2022 State of Healthcare IoT Device Security Report, 53% of internet-connected medical devices analyzed had a known vulnerability, while one-third of bedside devices carried a critical risk. (Cynerio observed over 10 million medical devices at over 300 hospitals and medical facilities across the world.) As per the findings, more than half of health enterprise-connected medical devices pose security risks due to critical flaws that could jeopardize patient care. Given this scenario, software testing that ensures a device's potential hazards are minimized should be a top priority for device manufacturers.
In this post, we'll dive into ractors in Ruby, exploring how to build a ractor. You'll send and receive messages in ractors, and learn about shareable and unshareable objects. But first, let's define the actor model and ractors, and consider when you should use ractors.
In the first part of this series on maintainable Elixir code, we started by applying rules for code predictability in our code design. We immediately saw an emerging pattern: a series of transformations on state. In this part, we'll explore this pattern further. We'll first learn how to write expressions as reducers. Then we'll use metaprogramming to make use of reducers and enforce code style seamlessly. Finishing up, we'll see an example where all the pieces fit together. Let's get going!
Every organization knows what quality should look like for the products they work on. But what’s the best way of knowing how the obtained quality compares to the expected quality of the product? Quality can be both quantitative and qualitative, and many questions could be asked to determine it. For example: "How do you feel about the product you use? How easy is it to use? Does it behave as expected? Do customers keep coming back to use our product? Do customers like how the product looks? Does the product behave as it was intended to by design?"
Here at Appian we're really excited to have released our low-code data security feature earlier this year. If you're not sure what that is, you're in good company. Even our parents just look at us blankly before saying "that's nice, dear" and changing the subject. Someone who doesn't love us unconditionally might be more skeptical.
Qlik’s entry at the Gartner Analytics and BI Bake-Off 2022 looked to address the big questions around clean energy and climate change and found some surprising insights.
Over the past week or so, I’ve been working on updating our Developer Workshop content. One of the trickiest parts of running workshops is the differences in local environment configuration: some attendees have a Mac, others Windows, some with admin permissions, and some without. So much depends on what your company provides and how they manage their systems. To make things easier, I’ve been relying on CodeSandbox to eliminate a lot of the unknown.
Lead time is a crucial metric in software development; it measures the time from when work is allocated, through code commit, and finally to production. For us, lead time is essentially the time from backlog to the app store, or, more narrowly, from a production build to the app store.
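As a minimal sketch of the metric itself (the function name and timestamp format below are my own, not from the article), lead time is just the elapsed time between two recorded events:

```python
from datetime import datetime

def lead_time_days(work_started: str, shipped: str) -> float:
    """Elapsed days between picking up a work item and shipping it.

    Illustrative helper: in practice the timestamps would come from
    your ticket tracker and your release pipeline.
    """
    fmt = "%Y-%m-%d %H:%M"
    delta = datetime.strptime(shipped, fmt) - datetime.strptime(work_started, fmt)
    return delta.total_seconds() / 86400  # seconds in a day

print(lead_time_days("2022-08-01 09:00", "2022-08-11 09:00"))  # 10.0
```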
What does it mean to institutionalize innovation? This question has been at the center of countless dialogues and debates for decades. Some believe it means conducting hackathons, investing in research and startups, and promoting open office seating. Others say that innovation is about investing in new technologies like Artificial Intelligence (AI), Machine Learning (ML), Blockchain, and Intelligent Automation, or embracing new business models. Enterprises spend a lot of capital to identify paths of innovation, innovation paradigms, and innovation models. The answer may be all of these.
Application programming interface (API) integrations can yield considerable cost savings for your business: integrating with an API eliminates the need to develop and maintain custom integrations for each application or system you use. However, you should be aware of some costs associated with API integrations before you decide to integrate. This article will discuss the different types of expenses related to API integrations and how to minimize those costs.
Today it’s not unusual to see organizations that have implemented mocking in their daily workflow, as mock APIs allow developers to speed up development without relying on external services. For those reasons and others, many engineers are looking to learn more about mock APIs and how they can best be introduced into their organization.
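The core idea can be sketched with nothing but the Python standard library: stand up a local server that returns a canned response in place of the real upstream service (the endpoint and payload below are invented for illustration):

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Hypothetical canned payload standing in for a real upstream service.
CANNED_USER = {"id": 1, "name": "Ada"}

class MockAPIHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps(CANNED_USER).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep request logging quiet
        pass

# Bind to an ephemeral port and serve in the background.
server = HTTPServer(("127.0.0.1", 0), MockAPIHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The app under test talks to the mock exactly as it would the real API.
url = f"http://127.0.0.1:{server.server_port}/users/1"
data = json.loads(urlopen(url).read())
server.shutdown()
print(data)
```

Dedicated mocking tools add conveniences on top of this pattern, such as recorded responses, latency injection, and shared team workspaces.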
Data visualization tools are dedicated software applications, or components of a business intelligence (BI) solution, that visually render data and present information through formats like graphs, charts, or heat maps; they are an important part of analytics today. The best data visualization tools provide a diverse range of visualization options that represent complex data across multiple use cases and help end users easily analyze it. Data analytics, the process of analyzing raw data, is used to examine all sorts of data, including real-time, historical, unstructured, structured, and qualitative.
Data observability solves many of the issues of modern data infrastructure, yet few ecommerce organizations understand this process or how to improve it. Here's what you need to know: data observability, in a data science context, helps you understand the current state of all the data in your ecommerce enterprise; it monitors and manages any problems that might occur during the data integration process; and it helps you make better data-driven decisions from better business insights.
Next.js is an open source React web development framework built on top of Node.js. With the help of the NextAuth.js library, I will explain how to authenticate a user in a Next.js application with WSO2 Identity Server. NextAuth.js is an open source authentication solution for Next.js applications; more information about it is available here. I'll use my existing Next.js application to show how NextAuth.js integrates with WSO2 Identity Server and the user authentication flow. You can find the Next.js template I used to create the sample application here.
Amazon's Elastic Beanstalk makes it easy to deploy and scale your applications with load balancing, health monitoring, and auto-scaling. In this tutorial, you'll learn how to deploy a Node.js application with AWS Elastic Beanstalk.
We received a lot of questions during our Build Pipelines webinar in July, so in this article, you'll read part 1 of the Q&A about Build Pipelines and rerunning workflows, stages, or pipelines.
The COVID-19 pandemic has changed how businesses in all industries and geographical areas operate for years to come. A recent McKinsey Global Survey of executives revealed that organizations had sped up the digitalization of their internal processes and supply-chain interactions by three to four years, and that the proportion of digital or digitally enabled items in their portfolios had advanced by a startling seven years. Competing in this changing business and economic environment takes new strategies and practices.
OpenTelemetry is a free and open-source observability framework that gives software developers a standard way to generate and collect telemetry, such as traces, metrics, and logs, from distributed systems. It grew out of the OpenTracing and OpenCensus projects, the latter developed by engineers at Google. The goal is to let developers instrument their code once and then export that telemetry to any backend of their choosing, with no need to worry about differences between operating systems and languages.
The use of APIs has rapidly been on the rise over the last several years; in fact, data shows that nearly 90% of developers are using APIs, including REST APIs, in some capacity. APIs, or application programming interfaces, are functions that allow applications to access data and interact with external software components, operating systems, and microservices. Essentially, the main goal of an API is to enable multiple applications to communicate with one another. While APIs are great tools for software development, they are also often the weakest link in a business operation’s cybersecurity.
Since 2016, the Stack Overflow Developer Survey has named the Rust language, also known as Rustlang, the "most loved programming language", and it is one of the most highly regarded modern programming languages in the world. Its syntax is quite similar to C++, but it has extra features like memory safety that can make your life much simpler and more secure. Only about 7% of respondents to the Stack Overflow survey claimed to use the language, despite its strong reputation in the development community. As a general-purpose language, Rust is supported by several frameworks that enable you to create almost anything you could build in another language, including websites, games, and GUIs.
Guest post by Bill Inmon. “Bill Inmon is an American computer scientist, recognized by many as the father of the data warehouse. Inmon wrote the first book, held the first conference, wrote the first column in a magazine, and was the first to offer classes in data warehousing.” (Source: Wikipedia.) An article headline, “BIG DATA/AI BLAME FAILURES ON BAD DATA”, caught my attention. Certainly, it is true that Big Data and artificial intelligence (AI) have not lived up to their hype. In fact, the hype was so great that perhaps no discipline or technology could have lived up to what was sold and promised.