RisiAi Tech News
Daily Brief

Gemma 4 Debuts as AWS Ends SSE-C by Default and EU Delays AI Act


AI & Machine Learning

Google released Gemma 4 this week, positioning the family as its most capable open-weight models to date and emphasizing mobile-first, agentic workflows that developers can run locally or in the cloud. The announcement highlights Apache-2.0 licensing and a focus on making strong reasoning and multimodal capabilities broadly accessible to research and product teams, signaling a continued push by major AI vendors to offer open alternatives to proprietary large models. Gemma 4’s release is likely to reshape developer expectations for on-device and on-prem inference and to accelerate competition around model portability and licensing. For builders and enterprises, the immediate questions are integration paths, how the attack surface changes for local deployments, and how Gemma 4 affects the economics of running sophisticated agents. Source: Google AI Blog

OpenAI closed a record-setting funding round that industry reporting puts at roughly $122 billion, at a post-money valuation widely reported near $852 billion, a milestone that places the company among the highest-valued private technology firms. The financing, reported publicly at the end of March, includes large strategic backers and is widely seen as a pre-IPO move to scale compute, product development, and enterprise offerings. The size and timing of the raise reshape competitive dynamics in the AI cloud ecosystem, increasing pressure on rivals and raising fresh questions about governance, market concentration, and regulatory scrutiny ahead of any public listing. For customers and partners, the round signals deep capital commitment to accelerated roadmap delivery, but it also deepens strategic dependence on a single dominant platform for AI tooling and models. Source: CNBC

Consumer Hardware

No major stories this sector today.

Cybersecurity

Researchers and commentators reported this week on an AI-assisted workflow that produced a working FreeBSD kernel exploit in hours, underscoring how generative tools can compress the time from vulnerability discovery to exploit development. Coverage centers on a specific CVE and rapid proof-of-concept exploit work that prompted FreeBSD security advisories and emergency patches, illustrating the dual-use risk of advanced code-generation models applied to offensive security. The incident has renewed calls for faster coordinated disclosure, better upstream patch management, and defensive investment in automated detection and memory-safety mitigations across BSD and Linux distributions. Organizations that rely on FreeBSD or embedded BSD-derived systems should audit exposed services, prioritize the vendor advisories, and accelerate patch testing and rollout. Source: Forbes

Enterprise Infrastructure

Amazon Web Services announced an April 2026 change to default encryption settings for S3 that will disable server-side encryption with customer-provided keys (SSE-C) for new buckets and select existing buckets, steering users toward SSE-KMS or client-side encryption. AWS frames the move as a security best practice that reduces a known ransomware vector and simplifies key management, but it will force some customers to change deployment automation or rework compliance mappings where customer-supplied keys were central. The change highlights the trade-off between customer control and a managed security posture; enterprises using SSE-C should review the AWS FAQ and migration guidance to avoid unexpected access or backup issues. Security teams should inventory current SSE-C usage, validate key-retention practices, and plan migrations to SSE-KMS or client-side encryption well ahead of the enforcement windows. Source: AWS Storage Blog
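One way to start that SSE-C inventory is to parse an S3 Inventory report, which can include a per-object encryption-status column. A minimal sketch, assuming a report configured with bucket, key, and encryption-status columns; the sample rows below are hypothetical, and actual column order depends on how the report is configured:

```python
import csv
import io
from collections import Counter

# Hypothetical sample rows from an S3 Inventory CSV (inventory files
# carry no header row; columns follow the report's configured fields).
SAMPLE = """\
my-bucket,backups/db1.dump,SSE-C
my-bucket,backups/db2.dump,SSE-KMS
my-bucket,logs/app.log,SSE-S3
my-bucket,exports/report.csv,SSE-C
"""

def count_encryption(inventory_csv: str) -> Counter:
    """Tally objects by encryption status, assuming bucket,key,status columns."""
    counts = Counter()
    for bucket, key, status in csv.reader(io.StringIO(inventory_csv)):
        counts[status] += 1
    return counts

counts = count_encryption(SAMPLE)
# Objects still using customer-provided keys must be re-encrypted
# (copied to SSE-KMS or client-side encryption) before enforcement.
print(counts["SSE-C"])
```

In practice the rows would come from the CSV files that S3 Inventory delivers to a destination bucket; any object still reported as SSE-C needs a re-encrypting copy before the enforcement date.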

NVIDIA’s GTC conference this week showcased multiple infrastructure-focused announcements, including new Vera Rubin production deployments and inference-per-watt advances that the company says will accelerate agentic AI and large-scale model serving. The event emphasized end-to-end system updates, from new BlueField networking platforms to inference-optimized software stacks, positioned to lower the barrier for enterprises building latency-sensitive AI services. For cloud providers, chip makers, and enterprise data centers, the announcements signal another step in the arms race for specialized hardware and software that optimize cost, power, and throughput for next-generation models. Customers should watch availability timelines and ecosystem integrations closely, because adoption decisions over the next 12–18 months will lock in substantial infrastructure spend. Source: NVIDIA Blog (GTC coverage)

Policy & Regulation

The European Parliament voted this month to postpone application deadlines for key high-risk provisions of the EU Artificial Intelligence Act, moving some compliance dates into late 2027 to allow more time for standards, enforcement planning, and technical guidance. MEPs also backed a set of carve-outs and clarifications, including explicit bans on certain “nudifier” apps, while seeking clearer timelines and implementation steps for biometric and critical-infrastructure systems. The delay eases immediate compliance pressure on vendors and public-sector adopters but increases uncertainty about when harmonized rules and certification schemes will actually be enforced, complicating legal planning for multinational providers. Companies operating in EU markets should continue preparing for stricter rules while tracking the evolving timetable and engaging with standards bodies to influence technical requirements. Source: European Parliament press room