
#AmazonSageMaker

Latest posts tagged with #AmazonSageMaker on Bluesky

SageMaker HyperPod now supports gang scheduling for distributed training workloads

Amazon SageMaker HyperPod task governance now supports gang scheduling, which ensures all pods required for a distributed training job are ready before training begins. Administrators can configure gang scheduling to prevent wasted compute from partial job runs and to avoid deadlocks from jobs waiting on resources.

Data scientists running distributed AI/ML training jobs on Amazon SageMaker HyperPod clusters using the EKS orchestrator require multiple pods to work together across nodes with pod-to-pod communication. When some pods start but others do not, jobs can hold onto resources without making progress, block other workloads, and increase costs. Gang scheduling resolves this by monitoring all pods in a workload and pulling the workload back if not all pods are ready within a set time. Pulled-back workloads are automatically requeued to prevent stalling. Administrators can adjust settings in the HyperPod console, such as how long to wait for pods to be ready, how to handle node failures, whether to admit workloads one at a time to avoid deadlocks on busy clusters, and how retries are scheduled.

This capability is currently available for Amazon SageMaker HyperPod clusters using the EKS orchestrator in the following AWS Regions: US East (N. Virginia), US East (Ohio), US West (N. California), US West (Oregon), Asia Pacific (Mumbai), Asia Pacific (Singapore), Asia Pacific (Sydney), Asia Pacific (Tokyo), Asia Pacific (Jakarta), Europe (Frankfurt), Europe (Ireland), Europe (London), Europe (Stockholm), Europe (Spain), and South America (São Paulo). To learn more, visit the SageMaker HyperPod webpage (https://aws.amazon.com/sagemaker-ai/hyperpod/) and the HyperPod task governance documentation (https://docs.aws.amazon.com/sagemaker/latest/dg/sagemaker-hyperpod-eks-operate-console-ui-governance-tasks-gang-scheduling.html).
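Conceptually, a gang scheduler admits a workload only when every pod becomes ready within the configured timeout, and otherwise pulls the workload back and requeues it. A minimal Python sketch of that all-or-nothing logic (the class, field names, and timing values are illustrative, not the HyperPod API):

```python
from dataclasses import dataclass, field

@dataclass
class GangScheduler:
    """All-or-nothing admission: run only when every pod is ready."""
    ready_timeout_s: int = 300           # how long to wait for all pods
    queue: list = field(default_factory=list)

    def admit(self, workload, pod_ready_times):
        """pod_ready_times: seconds each pod took to become ready,
        or None for a pod that never became ready."""
        all_ready = all(
            t is not None and t <= self.ready_timeout_s
            for t in pod_ready_times
        )
        if all_ready:
            return "scheduled"
        # Pull back the partially started workload and requeue it,
        # freeing its resources instead of letting it hold a deadlock.
        self.queue.append(workload)
        return "requeued"

sched = GangScheduler(ready_timeout_s=300)
print(sched.admit("train-job-a", [12, 20, 45, 30]))    # every pod ready in time
print(sched.admit("train-job-b", [12, None, 45, 30]))  # one pod never ready
```

The key property is that the second job releases its partial allocation instead of blocking other workloads while waiting for the missing pod.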

🆕 SageMaker HyperPod now supports gang scheduling for distributed training, ensuring all pods are ready before starting, preventing resource deadlocks, and minimizing wasted compute. Available in multiple AWS regions.

#AWS #AmazonSagemaker

Amazon SageMaker adds serverless workflows to Identity Center domains

Amazon SageMaker Unified Studio now supports Serverless Workflows in Identity Center domains. With this launch, customers using Identity Center domains can orchestrate data processing tasks with Apache Airflow (powered by Amazon Managed Workflows for Apache Airflow) without provisioning or managing Airflow infrastructure. Serverless Workflows were previously available only in IAM-based domains.

Serverless Workflows automatically provision compute resources when a workflow runs and release them when it completes, so you only pay for actual workflow run time. Each workflow runs with its own execution role and isolated worker, providing workflow-level security and preventing cross-workflow interference. With Serverless Workflows, Identity Center domain customers also get access to the Visual Workflow experience with support for around 200 operators, including built-in integration with AWS services such as Amazon S3, Amazon Redshift, Amazon EMR, AWS Glue, and Amazon SageMaker AI.

Serverless Workflows in Identity Center domains are available in all AWS Regions where SageMaker Unified Studio is supported. To learn more, visit the Serverless Workflows documentation (https://docs.aws.amazon.com/sagemaker-unified-studio/latest/userguide/serverless-workflows.html).
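The provision-per-run, pay-per-run model described above can be sketched in a few lines: an isolated worker exists only for the duration of a run, and billing accrues only for that run's duration. A hypothetical illustration (the class, role string, and per-second rate are invented, not SageMaker behavior or pricing):

```python
class ServerlessWorkflow:
    """Sketch: compute is provisioned per run, released afterwards,
    and billed only for the run's duration."""
    def __init__(self, name, execution_role, rate_per_second=0.01):
        self.name = name
        self.execution_role = execution_role  # own role per workflow
        self.rate = rate_per_second           # invented rate, not AWS pricing
        self.billed = 0.0

    def run(self, task, duration_s):
        worker = {"isolated_for": self.name}  # provisioned when the run starts
        try:
            return task()
        finally:
            self.billed += duration_s * self.rate  # pay only for run time
            worker.clear()                         # released on completion

wf = ServerlessWorkflow("nightly-etl", execution_role="etl-run-role")
wf.run(lambda: "ok", duration_s=120)
print(f"billed for a 120s run: ${wf.billed:.2f}")
```

The per-workflow `execution_role` mirrors the workflow-level security boundary the announcement describes: two workflows never share a role or a worker.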

🆕 Amazon SageMaker Unified Studio now supports Serverless Workflows in Identity Center domains, enabling data processing orchestration with Apache Airflow without infrastructure management, available in all supported AWS Regions.

#AWS #AmazonSagemaker

Amazon SageMaker Unified Studio adds notebook import/export and developer acceleration features

Amazon SageMaker Unified Studio notebooks now support import/export capabilities, enabling migration from JupyterLab and other notebook platforms. This release also introduces developer acceleration features, including cell reordering, keyboard shortcuts, cell renaming, and multi-line SQL support, designed to enhance productivity for data engineers and data scientists working with notebook-based workflows.

The new import/export functionality supports .ipynb, .json, and .py formats while preserving cell types, outputs, execution history, and metadata, making platform migration straightforward. You can export notebooks in four formats: Jupyter notebook with requirements (.zip), standard .ipynb, Python script (.py), and SageMaker Unified Studio native format (.json).

Developer acceleration features enable you to reorder cells without copy-paste duplication, assign custom names to cells for improved navigation in large notebooks, use familiar keyboard shortcuts for faster development, and execute multiple SQL statements in a single cell with results displayed in separate tabs for easy comparison and analysis.

This feature is available in all AWS Regions where Amazon SageMaker Unified Studio is available. To learn more, visit the Amazon SageMaker Unified Studio notebooks page (https://aws.amazon.com/sagemaker/unified-studio/notebooks/) and the user guide (https://docs.aws.amazon.com/sagemaker-unified-studio/latest/userguide/export-share-notebooks.html).
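Format round-tripping is tractable because an .ipynb file is plain JSON with typed cells. A rough sketch of the .ipynb-to-.py direction (a minimal converter that keeps markdown as comments; this is not the Unified Studio exporter, which also preserves outputs and execution history):

```python
import json

def notebook_to_script(ipynb_json: str) -> str:
    """Convert .ipynb JSON to a .py script, keeping markdown as comments."""
    nb = json.loads(ipynb_json)
    lines = []
    for cell in nb.get("cells", []):
        src = "".join(cell.get("source", []))
        if cell["cell_type"] == "code":
            lines.append(src.rstrip("\n"))
        else:
            # keep markdown/raw cells as comments so nothing is lost
            lines.append("\n".join("# " + l for l in src.splitlines()))
    return "\n\n".join(lines) + "\n"

sample = json.dumps({
    "cells": [
        {"cell_type": "markdown", "source": ["# Load data"]},
        {"cell_type": "code", "source": ["x = 1\n", "print(x)"]},
    ],
    "nbformat": 4, "nbformat_minor": 5,
})
print(notebook_to_script(sample))
```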

🆕 Amazon SageMaker Unified Studio boosts productivity with notebook import/export, developer tools, and support for .ipynb, .json, and .py formats, cell edits, and multi-line SQL. Available worldwide. For more, visit the Amazon SageMaker Unified Studio page.

#AWS #AmazonSagemaker

Amazon SageMaker Unified Studio adds observability for AWS Glue jobs via CloudWatch metrics

Amazon SageMaker Unified Studio now displays Amazon CloudWatch metrics for AWS Glue jobs directly alongside job logs in a single, unified interface. This enhancement adds observability to SageMaker Unified Studio, enabling data engineers and ETL developers to streamline their troubleshooting processes.

With this feature, teams can diagnose performance issues faster by correlating resource utilization patterns (including DPU utilization, memory consumption, CPU load, and data movement size) directly with job log output. Specific use cases include identifying compute bottlenecks, detecting memory pressure or out-of-memory conditions, optimizing resource allocation, and monitoring data pipeline performance at scale. By consolidating metrics and logs into one workspace, organizations can significantly reduce mean time to resolution (MTTR) for ETL pipeline issues and improve overall operational efficiency.

This feature is available in all AWS Regions where Amazon SageMaker Unified Studio is generally available. To access CloudWatch metrics, navigate to any Glue job in SageMaker Unified Studio, open a previous job run, and select the Metrics tab to view comprehensive performance data. To learn more about Amazon SageMaker Unified Studio and this new capability, visit the SageMaker Unified Studio page (https://aws.amazon.com/sagemaker/unified-studio/) and see the documentation (https://docs.aws.amazon.com/sagemaker-unified-studio/latest/userguide/smus-monitoring-jobs.html).
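As an illustration of the metrics-to-logs correlation this enables, detecting memory pressure in a job run reduces to finding the timestamps where worker memory crosses a threshold, then reading the log lines around them. A small sketch (the samples, threshold, and function are invented; the real numbers come from CloudWatch):

```python
def find_memory_pressure(samples, threshold=90.0):
    """Return timestamps where worker memory usage (percent) crosses
    the threshold, so they can be matched against job log output."""
    return [ts for ts, pct in samples if pct >= threshold]

# Hypothetical per-minute memory samples for one Glue job run.
run_samples = [
    ("12:00", 41.0), ("12:01", 63.5), ("12:02", 92.1), ("12:03", 97.8),
]
spikes = find_memory_pressure(run_samples)
print(spikes)  # timestamps worth inspecting in the run's logs
```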

🆕 Amazon SageMaker Unified Studio adds observability for AWS Glue jobs via CloudWatch metrics, speeding up troubleshooting and optimizing performance by linking resource use with job logs in one interface. Available globally, it cuts mean time to resolve ETL issues.

#AWS #AmazonSagemaker

Amazon SageMaker Studio launches support for Kiro and Cursor IDEs as remote IDEs

Today, AWS announces the ability to connect remotely from the Kiro and Cursor IDEs to Amazon SageMaker Studio. This new capability allows data scientists, ML engineers, and developers to keep their Kiro or Cursor setup, including spec-driven development, conversational coding, and automated feature generation, while accessing the scalable compute resources of Amazon SageMaker Studio. By connecting Kiro or Cursor to SageMaker Studio using the AWS Toolkit extension, you can eliminate context switching between your local IDE and cloud infrastructure, maintaining your existing agentic development workflows within a single environment for all your AWS analytics and AI/ML services.

SageMaker Studio offers a broad set of fully managed cloud interactive development environments (IDEs), including JupyterLab, Code Editor based on Code-OSS (Open-Source Software), and VS Code as a remote IDE. Starting today, you can also use your customized local Kiro or Cursor setup, complete with specs, steering files, and hooks, while accessing your compute resources and data on Amazon SageMaker.

You can authenticate using the AWS Toolkit extension in Kiro or Cursor, or through SageMaker Studio's web interface. Once authenticated, connect to any of your SageMaker Studio development environments in a few clicks. You maintain the same security boundaries as SageMaker Studio's web-based environments while developing AI models and analyzing data in the local IDE of your choice, Kiro or Cursor. To learn more, refer to the SageMaker user guide (https://docs.aws.amazon.com/sagemaker/latest/dg/remote-access.html).

🆕 AWS now lets you connect Kiro and Cursor IDEs to Amazon SageMaker Studio for remote development, enabling seamless access to SageMaker's scalable resources while maintaining your local setup's spec-driven features, eliminating context switching.

#AWS #AmazonSagemaker

Amazon SageMaker Unified Studio launches support for remote connection from Cursor IDE

Today, AWS announces remote connection from the Cursor IDE to Amazon SageMaker Unified Studio via the AWS Toolkit extension. This new capability allows data scientists, ML engineers, and developers to keep their Cursor setup, including AI-powered code completion, natural language editing, and multi-file editing, while accessing the scalable compute resources of Amazon SageMaker. By connecting Cursor to SageMaker Unified Studio using the AWS Toolkit extension, you can eliminate context switching between your local IDE and cloud infrastructure, maintaining your existing AI-assisted development workflows within a single environment for all your AWS analytics and AI/ML services.

SageMaker Unified Studio, part of the next generation of Amazon SageMaker, offers a broad set of fully managed cloud interactive development environments (IDEs), including JupyterLab and Code Editor based on Code-OSS (Open-Source Software). Starting today, you can also use your customized local Cursor setup, complete with custom rules, extensions, and AI model preferences, while accessing your compute resources and data on Amazon SageMaker. Since Cursor is built on Code-OSS, authentication is secured via IAM through the AWS Toolkit extension, giving you access to all your SageMaker Unified Studio domains and projects.

This integration provides a convenient path from your local AI-powered development environment to scalable infrastructure for running workloads across data processing, SQL analytics services such as Amazon EMR, AWS Glue, and Amazon Athena, and ML workflows, all with enterprise-grade security including customer-managed encryption keys and AWS IAM integration.

This feature is available in all AWS Regions where Amazon SageMaker Unified Studio is available. To learn more, visit the local IDE support documentation (https://docs.aws.amazon.com/sagemaker-unified-studio/latest/userguide/local-ide-support.html).

🆕 AWS now supports remote connection from the Cursor IDE to Amazon SageMaker Unified Studio via the AWS Toolkit, streamlining AI-assisted workflows, cutting context switching, and keeping enterprise security. Available in all Regions where SageMaker Unified Studio is available.

#AWS #AmazonSagemaker

Amazon SageMaker AI now supports serverless reinforcement fine-tuning for 12 additional models

Amazon SageMaker AI now supports serverless model customization and reinforcement fine-tuning for 12 additional open-weight models, enabling you to fine-tune and evaluate them without provisioning or managing infrastructure. The newly supported models are: gpt-oss-120b, Qwen2.5 72B Instruct, DeepSeek-R1-Distill-Llama-70B, Qwen3 14B, DeepSeek-R1-Distill-Qwen-14B, Qwen2.5 14B Instruct, DeepSeek-R1-Distill-Llama-8B, DeepSeek-R1-Distill-Qwen-7B, Qwen3 4B, Meta Llama 3.2 3B Instruct, Qwen3 1.7B, and DeepSeek-R1-Distill-Qwen-1.5B. With this expansion, you can customize these models using supervised fine-tuning (SFT), direct preference optimization (DPO), and reinforcement fine-tuning (RFT) techniques including RLVR and RLAIF, and pay only for what you use.

Reinforcement fine-tuning enables you to align models to complex, domain-specific reasoning tasks where traditional techniques such as SFT alone fall short. With RLVR (reinforcement learning with verifiable rewards), you can improve model accuracy on verifiable tasks such as code generation, math, and structured extraction by providing reward signals based on correctness. RLAIF (reinforcement learning from AI feedback) uses AI-generated feedback to steer model behavior toward your quality and safety preferences. These techniques are available on previously supported and newly added models, with no cluster setup, capacity planning, or distributed training expertise required.

These models and fine-tuning techniques are available in US East (N. Virginia), US West (Oregon), Asia Pacific (Tokyo), and EU (Ireland). To get started, see the Amazon SageMaker AI model customization product page (https://aws.amazon.com/sagemaker/ai/model-customization/) and the Amazon SageMaker AI pricing page (https://aws.amazon.com/sagemaker/ai/pricing/, Model Customization tab) for the full list of models, techniques, and prices.
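The reward signal RLVR relies on can be very simple for a verifiable task: check the model's answer against ground truth and pay out 1.0 or 0.0. A toy reward function (illustrative only; real verifiers use unit tests, symbolic math checks, or schema validation rather than exact string match):

```python
def rlvr_reward(model_output: str, expected_answer: str) -> float:
    """Reward 1.0 when the verifiable answer is correct, else 0.0.
    The 'verifier' here is an exact match on the final output line."""
    final = model_output.strip().splitlines()[-1].strip()
    return 1.0 if final == expected_answer else 0.0

print(rlvr_reward("Let x = 6*7.\n42", "42"))   # correct final answer
print(rlvr_reward("Let x = 6*7.\n41", "42"))   # wrong final answer
```

During RFT, rewards like this score sampled completions so the policy is pushed toward verifiably correct outputs; RLAIF replaces the hard-coded check with a judge model's preference score.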

🆕 Amazon SageMaker AI now supports serverless reinforcement fine-tuning for 12 new models, allowing customization without infrastructure. Pay only for usage. Available in US, Asia Pacific, and EU regions. See Amazon SageMaker AI for details.

#AWS #AmazonSagemaker #AmazonMachineLearning

AWS Weekly Roundup: Claude Sonnet 4.6 in Amazon Bedrock, Kiro in GovCloud Regions, new Agent Plugins, and more (February 23, 2026)

Last week, my team met many developers at Developer Week in San Jose. My colleague Vinicius Senger delivered a great keynote about renascent software, a new way of building and evolving applications where humans and AI collaborate as co-developers using Kiro. Other colleagues, Du'An Lightfoot, Elizabeth Fuentes, Laura Salinas, and Sandhya Subramani, spoke about building and […]


#AWS #AmazonAurora #AmazonBedrock #AmazonEc2 #AmazonNova #AmazonSagemaker #Launch #News #WeekInReview


I've just finished "Amazon SageMaker" course on Udemy #training #formacion #AWS #AmazonSageMaker #MachineLearning #ML

SageMaker Training Plans now enables extending existing capacity commitments without workload reconfiguration

SageMaker Training Plans allows you to reserve GPU capacity within specified time frames in cluster sizes of up to 64 instances. Today, Amazon SageMaker AI announces that Training Plans can now be extended when your AI workloads take longer than anticipated, ensuring uninterrupted access to capacity. You can extend plans in 1-day increments up to 14 days, or 7-day increments up to 182 days (26 weeks). Extensions can be initiated via the API or the SageMaker console. Once the extension is purchased, the workload continues to run uninterrupted without you needing to reconfigure it.

SageMaker AI helps you create the most cost-efficient training plan that fits within your timeline and AI budget. Once you create and purchase your training plans, SageMaker automatically provisions the infrastructure and runs the AI workloads on these compute resources without requiring any manual intervention.

See the SageMaker AI pricing page (https://aws.amazon.com/sagemaker/ai/pricing/) for a detailed breakdown of instance availability by AWS Region. To learn more about training plan extensions, see the Amazon SageMaker Training Plans User Guide (https://docs.aws.amazon.com/sagemaker/latest/dg/reserve-capacity-with-training-plans.html).
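Reading the stated rule as "any length from 1 to 14 days, then whole weeks up to 182 days" (an assumption about how the increments compose, not an AWS API), the valid extension lengths can be checked with a small helper:

```python
def is_valid_extension(days: int) -> bool:
    """1-day steps up to 14 days, or 7-day steps up to 182 days (26 weeks).
    Interpretation of the announcement's wording, not an AWS validator."""
    if 1 <= days <= 14:
        return True
    return days <= 182 and days % 7 == 0

# 15 days fails (past 14, not a whole number of weeks); 21 and 182 pass.
print([d for d in (1, 14, 15, 21, 182, 183) if is_valid_extension(d)])
```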

🆕 Amazon SageMaker Training Plans now allows extending GPU capacity reservations in 1-day increments up to 14 days, or 7-day increments up to 182 days, without reconfiguring workloads. Extensions can be initiated via API or console, ensuring uninterrupted AI training.

#AWS #AmazonSagemaker

Amazon SageMaker Unified Studio now supports faster data preview in Visual ETL

Amazon SageMaker Unified Studio introduces data preview v2.0 for Visual ETL, a new data preview mode that delivers near-instant results when building and iterating on visual ETL jobs. With data preview v2.0, data engineers and analysts can see the output of each transform in about one second, with no session startup required and at no additional compute cost.

Data preview v2.0 uses an in-browser query engine to load and process data locally, removing the dependency on server-side Spark sessions for preview operations. Source data is fetched once and cached in the browser, so subsequent transforms apply instantly without re-querying the underlying data source. For Amazon Redshift users, this means you can iterate on transforms without additional queries against your Redshift cluster, keeping your preview workflow fast and your cluster resources focused on production workloads.

Data preview v2.0 supports CSV, Parquet, and JSON files from Amazon S3, in addition to data from Amazon Redshift, Amazon S3 Tables, AWS Glue Data Catalog, and third-party sources including Snowflake, MySQL, PostgreSQL, SQL Server, Oracle, Google BigQuery, Amazon DynamoDB, and Amazon DocumentDB. A toggle in the Visual ETL editor lets you switch between data preview v2.0 and the original Spark-based preview at any time.

Data preview v2.0 in Visual ETL is available in all AWS Regions where Amazon SageMaker Unified Studio is supported. To learn more, visit the Amazon SageMaker Unified Studio documentation (https://docs.aws.amazon.com/sagemaker-unified-studio/latest/userguide/visual-etl-data-previews.html).
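The fetch-once behavior amounts to a cache in front of the source: the first preview pulls the rows, and every later transform runs against the cached copy instead of re-querying the source. A minimal sketch (class and names are illustrative, not the Unified Studio implementation):

```python
class LocalPreview:
    """Cache source rows once; apply preview transforms locally."""
    def __init__(self, fetch_source):
        self._fetch = fetch_source
        self._cache = None
        self.source_queries = 0   # how many times the source was hit

    def preview(self, transform):
        if self._cache is None:          # first preview: hit the source once
            self._cache = self._fetch()
            self.source_queries += 1
        return transform(self._cache)    # later previews: local only

rows = [{"city": "Paris", "n": 3}, {"city": "Osaka", "n": 7}]
p = LocalPreview(lambda: rows)
p.preview(lambda r: [x for x in r if x["n"] > 5])     # fetches the source
p.preview(lambda r: sorted(r, key=lambda x: x["n"]))  # served from cache
print("source queried", p.source_queries, "time(s)")
```

This is why iterating on Redshift-backed transforms adds no load to the cluster: only the first fetch touches it.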

🆕 Amazon SageMaker Unified Studio's v2.0 data preview speeds up Visual ETL, delivering near-instant transform results in one second. It uses an in-browser query engine, caches data, and supports CSV, Parquet, JSON, and multiple sources. Available in all AWS Regions.

#AWS #AmazonSagemaker

Amazon SageMaker Unified Studio adds light mode support for IAM-based domains Today, AWS announces light mode support in Amazon SageMaker Unified Studio for IAM-based domains. Customers can now configure the visual interface mode to match their preference, choosing between dark and light themes. Light mode helps improve readability in bright environments and provides a familiar visual experience for customers who prefer lighter interfaces. Combined with the existing dark mode, this update gives you full control over your development environment's appearance, improving accessibility and reducing eye strain across varying lighting conditions. In SageMaker Unified Studio settings, you can click on 'customize appearance' under your Profile settings to choose between visual modes including dark and light. The setting persists across browsers and devices. This feature is available in all regions where Amazon SageMaker Unified Studio is available. To learn more, refer to the User Guide.

🆕 AWS adds light mode support in Amazon SageMaker Unified Studio for IAM-based domains, letting users choose between dark and light themes for improved readability and accessibility. Available in all regions, settings persist across browsers and devices.

#AWS #AmazonSagemaker

Amazon SageMaker Unified Studio adds metadata sync with third-party catalogs Amazon SageMaker Unified Studio now supports metadata and context sync across Atlan, Collibra, and Alation. These integrations synchronize catalog metadata between Amazon SageMaker Catalog and each partner platform, giving teams a consistent view of their data and AI assets regardless of which tool they use day to day. Organizations can maintain aligned glossary terms, asset descriptions, and ownership information across platforms without manual reconciliation. All three integrations synchronize key metadata elements including projects, assets, descriptions, glossary terms, and their hierarchies. With the Collibra integration, you can synchronize metadata in both directions between SageMaker Catalog and the partner platform, so updates you make in one are reflected in the other. You can also manage SageMaker Unified Studio data access requests from Collibra. With the Atlan and Alation integrations, you can ingest metadata from SageMaker Catalog, with additional enhancements coming soon. You set up the Atlan and Alation integrations by creating a connection to SageMaker Unified Studio from within each platform, while the Collibra integration is available as an open-source solution on GitHub. To learn more, visit the Amazon SageMaker Unified Studio documentation. For implementation details, see the Atlan blog post, Collibra blog post, and Alation blog post.
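A bidirectional sync like the Collibra integration needs some reconciliation rule for assets edited on both sides. One common choice is last-writer-wins, sketched below; this is purely illustrative (the announcement does not specify the algorithm, and `AssetMeta` is a hypothetical, simplified record):

```python
from dataclasses import dataclass

@dataclass
class AssetMeta:
    """Hypothetical metadata record; real catalogs carry far richer fields."""
    description: str
    updated_at: int      # last-modified timestamp, e.g. epoch seconds

def merge_catalogs(side_a, side_b):
    """Last-writer-wins reconciliation: for each asset key, keep whichever
    side was updated most recently; keys present on one side only are kept."""
    merged = dict(side_a)
    for key, meta in side_b.items():
        if key not in merged or meta.updated_at > merged[key].updated_at:
            merged[key] = meta
    return merged
```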

🆕 Amazon SageMaker Unified Studio syncs metadata with Atlan, Collibra, and Alation for consistent data and AI asset views. Key elements like projects and glossary terms sync, with Collibra offering bidirectional sync and data access requests. Integrate via a connection from Atlan or Alation, or Collibra's open-source solution on GitHub.

#AWS #AmazonSagemaker

Amazon SageMaker Unified Studio now supports AWS Glue 5.1 for data processing jobs Amazon SageMaker Unified Studio now supports AWS Glue 5.1 for Visual ETL, notebook, and code-based data processing jobs. With AWS Glue 5.1 in Amazon SageMaker Unified Studio, data engineers and data scientists can run jobs on Apache Spark 3.5.6 with Python 3.11 and Scala 2.12.18, and use updated open table format libraries including Apache Iceberg 1.10.0, Apache Hudi 1.0.2, and Delta Lake 3.3.2. You can use AWS Glue 5.1 in Amazon SageMaker Unified Studio when creating data processing jobs by selecting Glue 5.1 from the version dropdown in job settings. This applies to Visual ETL jobs, notebook jobs, and code-based jobs, so you can take advantage of the latest Spark runtime and open table format libraries across all your data processing workflows. AWS Glue 5.1 in Amazon SageMaker Unified Studio is available in US East (N. Virginia), US East (Ohio), US West (Oregon), Europe (Ireland), Europe (Stockholm), Europe (Frankfurt), Europe (Spain), Asia Pacific (Hong Kong), Asia Pacific (Singapore), Asia Pacific (Sydney), Asia Pacific (Tokyo), Asia Pacific (Malaysia), Asia Pacific (Thailand), Asia Pacific (Mumbai), and South America (Sao Paulo). To learn more, visit the Amazon SageMaker Unified Studio documentation. For details on what's included in AWS Glue 5.1, including updated open table format support and access control capabilities, see the AWS Glue documentation.
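In Unified Studio the runtime is picked from a dropdown, but teams scripting jobs against the underlying AWS Glue API select it via the CreateJob `GlueVersion` field. A hedged sketch of the request payload follows, assuming "5.1" is an accepted `GlueVersion` value per this announcement; the role ARN and script path are placeholders, and the resulting dict would be passed to boto3's `glue.create_job(**request)`:

```python
def build_glue_job_request(name, role_arn, script_location):
    """Assemble a CreateJob payload pinned to Glue 5.1.
    `role_arn` and `script_location` are caller-supplied placeholders."""
    return {
        "Name": name,
        "Role": role_arn,
        "GlueVersion": "5.1",   # Spark 3.5.6 / Python 3.11 per the announcement
        "Command": {
            "Name": "glueetl",
            "PythonVersion": "3",
            "ScriptLocation": script_location,
        },
    }
```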

🆕 Amazon SageMaker Unified Studio now supports AWS Glue 5.1 for data processing jobs, enabling Visual ETL, notebooks, and code-based jobs with Spark 3.5.6 and updated libraries like Apache Iceberg, Hudi, and Delta Lake. Available in multiple regions.

#AWS #AmazonSagemaker #AwsGlue

Amazon SageMaker Unified Studio launches support for remote connection from Kiro IDE Today, AWS announces the ability to remotely connect from Kiro IDE to Amazon SageMaker Unified Studio. This new capability allows data scientists, ML engineers, and developers to leverage their Kiro setup - including its spec-driven development, conversational coding, and automated feature generation capabilities - while accessing the scalable compute resources of Amazon SageMaker. By connecting Kiro to SageMaker Unified Studio using the AWS Toolkit extension, you can eliminate context switching between your local IDE and cloud infrastructure, maintaining your existing agentic development workflows within a single environment for all your AWS analytics and AI/ML services. SageMaker Unified Studio, part of the next generation of Amazon SageMaker, offers a broad set of fully managed cloud interactive development environments (IDEs), including JupyterLab and Code Editor based on Code-OSS (Open-Source Software). Starting today, you can also use your customized local Kiro setup - complete with specs, steering files, and hooks - while accessing your compute resources and data on Amazon SageMaker. Since Kiro is built on Code-OSS, authentication is secure via IAM through the AWS Toolkit extension, giving you access to all your SageMaker Unified Studio domains and projects. This integration provides a convenient path from your local AI-powered development environment to scalable infrastructure for running workloads across data processing, SQL analytics services like Amazon EMR, AWS Glue, and Amazon Athena, and ML workflows - all with enterprise-grade security including customer-managed encryption keys and AWS IAM integration. This feature is available in all Regions where Amazon SageMaker Unified Studio is available. To learn more, refer to the SageMaker user guide.

🆕 AWS now connects Kiro IDE to Amazon SageMaker Unified Studio, letting data scientists use Kiro's tools with SageMaker's compute, all in one place for smooth analytics and AI/ML workflows.

#AWS #AmazonMachineLearning #AmazonSagemaker

Amazon SageMaker HyperPod now supports API-driven Slurm configuration Amazon SageMaker HyperPod now supports API-driven Slurm configuration, enabling you to define Slurm topology and shared filesystem configurations directly in the cluster create and update APIs or through the AWS Console. SageMaker HyperPod helps you provision resilient clusters for running machine learning (ML) workloads and developing state-of-the-art models such as large language models (LLMs), diffusion models, and foundation models (FMs). With this new API-driven configuration, you can now specify Slurm node types including Controller, Login, and Compute for cluster instance groups; instance group to partition mappings; and FSx for Lustre and FSx for OpenZFS filesystem mounts per instance group directly in the cluster API definition or through the advanced configuration section in the AWS Console. When you modify partition-node mappings directly in Slurm's native configuration files to fine-tune cluster resource assignments, Slurm's partition-node configurations can drift from HyperPod's view. A new cluster-level SlurmConfigStrategy helps you manage drift with three options: Managed, Overwrite, and Merge. The Managed strategy allows you to manage instance group to partition mappings completely via the API or Console, and automatically detects drift in partition-to-node mappings during scale-up or scale-down operations. When drift is detected, cluster updates are paused until you resolve it by switching to the Overwrite strategy to force API-defined mappings, the Merge strategy to preserve manual customizations, or by directly updating Slurm configurations to align with HyperPod. API-driven Slurm configuration is available in all AWS Regions where SageMaker HyperPod is available. To get started, you can use the AWS Management Console, AWS CLI, AWS CloudFormation, or AWS SDKs. 
For more information, see the Amazon SageMaker HyperPod documentation for creating clusters using the Console or the CLI, and the API reference for CreateCluster and UpdateCluster.
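The three strategies above can be modeled as a small decision function. This is a toy illustration of the documented behavior, not HyperPod's actual reconciliation code; mappings are modeled here as partition name to a set of instance group names:

```python
from enum import Enum

class SlurmConfigStrategy(Enum):
    MANAGED = "Managed"
    OVERWRITE = "Overwrite"
    MERGE = "Merge"

def resolve_partitions(api_defined, on_cluster, strategy):
    """Illustrative drift resolution for partition -> instance-group maps."""
    if strategy is SlurmConfigStrategy.MANAGED:
        if api_defined != on_cluster:
            # drift detected: HyperPod pauses the cluster update until resolved
            raise RuntimeError("partition mappings drifted from the API definition")
        return api_defined
    if strategy is SlurmConfigStrategy.OVERWRITE:
        # force the API-defined mappings, discarding manual edits
        return {part: set(groups) for part, groups in api_defined.items()}
    # MERGE: preserve manual customizations and fold in API-defined entries
    merged = {part: set(groups) for part, groups in on_cluster.items()}
    for part, groups in api_defined.items():
        merged.setdefault(part, set()).update(groups)
    return merged
```

Under Managed, drift blocks the update; Overwrite makes the API definition authoritative; Merge keeps manually added nodes alongside the API-defined ones.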

🆕 Amazon SageMaker HyperPod now supports API-driven Slurm setup, enabling direct cluster topology and shared filesystem configuration via cluster create/update APIs or AWS Console, managing Slurm partition-node mappings and drift. Available in all AWS Regions where SageMaker HyperPod is available.

#AWS #AmazonSagemaker
