CloudScoop
GCP · BigQuery · Feature · April 13, 2026
AI Insights

To reduce LLM token consumption and query latency when processing large datasets, enable optimized mode using the following managed AI functions:

• AI.IF
• AI.CLASSIFY

This feature is in Preview.
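For context, a hedged sketch of how one of these functions might appear in a query. The table, prompt, and connection ID below are hypothetical placeholders, and the exact optimized-mode syntax is in the linked announcement:

```sql
-- Hypothetical example: AI.IF evaluates a natural-language condition
-- per row and returns a BOOL, so it can filter in a WHERE clause.
-- `mydataset.support_tickets` and the connection ID are placeholders.
SELECT ticket_id, body
FROM mydataset.support_tickets
WHERE AI.IF(
  ('Is this support ticket about a billing problem? ', body),
  connection_id => 'us.my_llm_connection'
);
```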

Read original announcement

More from BigQuery

• You can now use EXPORT DATA statements to reverse ETL BigQuery data to AlloyDB. (Apr 15, 2026)
• BigQuery agent analytics is now generally available (GA) in the Google Agent Developer Kit. (Apr 15, 2026)
• A known issue has been resolved where a materialized view refresh could expose masked or filtered data... (Apr 15, 2026)
• Support for the AI.AGG function preview has been temporarily disabled. (Apr 13, 2026)