Dive into deep insights and technical expertise 😎

Friday, June 20, 2025

Managing ServiceNow Storage Effectively: Tips, Pitfalls, and AI Opportunities

 


🧠 Introduction

As organizations grow, so does their ServiceNow data. From audit logs to attachments and historical records, an unmanaged instance can quickly exceed its storage limits — leading to performance degradation, breaches of contracted storage allowances, or even broken functionality.

In this article, we explore practical and strategic ways to manage your ServiceNow storage footprint, including smart automation like auto-flush rules and how AI can support proactive cleanup and decision-making.


🧰 1. Understand What’s Consuming Space

Start by identifying top storage consumers:

  • Navigate to System Diagnostics → Tables to view table sizes.

  • Focus on heavy tables like sys_audit, sys_email, sys_attachment, task, and cmdb_ci.

🧾 Script: List tables over 1GB

```javascript
// List tables larger than 1 GB (1 GB = 1073741824 bytes)
var gr = new GlideTableSizeUtil();
var tables = gr.getAllTableSizes();
while (tables.next()) {
    if (tables.size_bytes > 1073741824) {
        gs.print(tables.name + " — " + (tables.size_bytes / 1073741824).toFixed(2) + " GB");
    }
}
```

🔄 2. Use Auto-Flush for Log and System Tables

For fast-growing system tables, auto-flush rules are the recommended method to manage data safely and efficiently.

๐Ÿ“ Examples of flushable tables:

  • syslog_transaction

  • syslog

  • sys_email

  • sys_rollback_sequence

  • sys_audit_delete

  • ecc_queue

🛠️ Configure via:

System Definition → Auto Flush Rules

🔧 Example: Auto-flush syslog_transaction older than 30 days:


```
Table: syslog_transaction
Condition: sys_created_on < javascript:gs.daysAgo(30)
Batch Size: 500
```

✅ Auto-flush avoids scripting risks and runs in controlled background processes.


🧾 3. Use Retention Policies for Business Data

For structured business records (like incidents or change requests), use:

  • Auto Archive Rules for historical visibility

  • Auto Delete Rules for permanent cleanup when archive isn’t required

  • Data Retention Policies aligned with legal/compliance frameworks

Avoid deleting directly via script unless absolutely necessary.
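As an illustration, an archive rule for closed incidents might look like the sketch below. The table, condition, and retention window here are hypothetical — align them with your own legal and compliance requirements before enabling anything:

```
Table: incident
Condition: active=false^closed_at<javascript:gs.beginningOfLastYear()
Archive related records: attachments, journal entries
```

Because archive rules run as controlled background jobs, they avoid the locking and cascade-delete risks of ad-hoc cleanup scripts.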


🤖 4. AI-Driven Optimization (Emerging Practice)

AI models can enhance storage strategies by:

  • Recommending purge targets based on usage frequency

  • Highlighting duplicate or redundant attachments

  • Analyzing long-running jobs for inefficiencies

🧾 Script: Identify long-duration scheduled jobs

```javascript
// Average run time per scheduled job name
var jobLog = new GlideAggregate('sys_trigger');
jobLog.addAggregate('AVG', 'duration');
jobLog.groupBy('name');
jobLog.orderByAggregate('AVG', 'duration');
jobLog.query();
while (jobLog.next()) {
    gs.print(jobLog.name + " → Avg Duration: " + jobLog.getAggregate('AVG', 'duration') + " ms");
}
```

💡 5. Common Pitfalls to Avoid

  • Overusing scripting to delete data: Prefer system-supported methods like auto-flush or retention rules.

  • Archiving ≠ deleting: Archives still consume space, though less than active records.

  • Uncoordinated full data pulls from PROD can lead to slowness, API throttling, or job failures.

  • Neglecting email and attachment tables, which silently grow large.

🧾 Script: Find attachments >50MB, older than 1 year

```javascript
// Attachments over 50 MB (52428800 bytes) created more than a year ago
var attach = new GlideRecord('sys_attachment');
attach.addEncodedQuery('size_bytes>52428800^sys_created_onRELATIVELE@year@ago@1');
attach.query();
while (attach.next()) {
    gs.print(attach.file_name + " — " + (attach.size_bytes / 1048576).toFixed(2) + " MB");
}
```

๐Ÿ” 6. Enforce Limits for Integrations and APIs

  • Use properties like glide.rest.query.max_records to limit API responses.

  • Restrict external integrations from triggering large table queries.

  • Rate-limit ETL tools or enforce best practices on full/delta pulls.
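For example, capping Table API responses comes down to a single system property; the value below is purely illustrative — pick a ceiling that matches your own integration patterns:

```
glide.rest.query.max_records = 5000
```

Combined with pagination on the consumer side, this prevents a single misbehaving integration from pulling an entire large table in one transaction.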


🧭 7. Collaborate with Data Consumers

  • Communicate with data warehouse/ETL teams using Table API.

  • Insist that full-load testing happens in non-prod, never directly on production.

  • Prevent SQL teams from running unattended test queries against live instances.


📊 System Tables to Auto-Flush or Archive

| Table Name            | Reason for Growth      | Recommended Action         |
|-----------------------|------------------------|----------------------------|
| sys_audit             | Field change logs      | Auto-archive or delete     |
| sys_email             | All email activity     | Auto-flush after retention |
| syslog_transaction    | Transaction logs       | Auto-flush older entries   |
| sys_rollback_sequence | Workflow rollback data | Auto-flush                 |
| sys_attachment_doc    | File storage           | Identify & purge large/old |

📌 Conclusion

Managing storage in ServiceNow is about more than just saving disk space. It’s a proactive approach to maintaining performance, cost efficiency, and platform health. With the right mix of auto-flush, retention rules, and even AI-enhanced analysis, you can keep your instance lean and compliant — while avoiding manual, error-prone deletion methods.


Enterprise Tips – Secure Credentials, Proxy Settings, and MID Server Options for PowerShell + ServiceNow API

 


🛡️ Introduction

So far in this series, we’ve explored how to connect PowerShell to the ServiceNow Table API, handle errors, and optimize performance. But in enterprise environments, you’ll run into real-world constraints like:

  • Secure credential storage

  • Network proxies

  • Internal ServiceNow instances behind firewalls

  • Compliance restrictions

In this final article, we cover how to run secure, robust API integrations in production environments using best practices and ServiceNow architecture features.


๐Ÿ” 1. Securely Store and Use Credentials

Hardcoding usernames and passwords in scripts is a security risk. Use these safer alternatives:

✅ Windows Credential Manager (for PowerShell)

Store credentials once, then retrieve them securely in your script:

```powershell
# Retrieve a credential saved in Windows Credential Manager
$creds = Get-StoredCredential -Target "SNOW_API_CRED"
$user = $creds.Username
$pass = $creds.Password
```

To save it (one-time setup):

```powershell
# One-time setup: store the credential
New-StoredCredential -Target "SNOW_API_CRED" -UserName "admin" -Password "your_password" -Persist LocalMachine
```

You can use modules like CredentialManager or SecretManagement from the PowerShell Gallery.


✅ Secure Vaults (for enterprise)

If you’re in a DevOps setup, integrate with:

  • Azure Key Vault

  • HashiCorp Vault

  • AWS Secrets Manager

This ensures your scripts never expose plaintext secrets.
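Whichever vault you choose, the integration pattern is the same: resolve the secret at runtime instead of embedding it in the script. In pseudocode (the client call and secret path are illustrative — each vault product has its own API):

```
# Pseudocode — vault client APIs differ per product
secret   = vault.get("servicenow/api-user")   # fetched at runtime, never stored on disk
user     = secret["username"]
password = secret["password"]
# build the Authorization header from these values, then discard them
```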


๐ŸŒ 2. Use Proxy Settings When Required

Corporate environments often require internet access via proxy. PowerShell supports this:

```powershell
# Route the request through a corporate proxy
$proxy = New-Object System.Net.WebProxy("http://proxy.company.com:8080")
$handler = New-Object System.Net.Http.HttpClientHandler
$handler.Proxy = $proxy
$client = [System.Net.Http.HttpClient]::new($handler)
$response = $client.GetAsync($url).Result
```

Or for Invoke-RestMethod (basic use):

```powershell
Invoke-RestMethod -Uri $url -Proxy "http://proxy.company.com:8080" -Headers $headers
```

๐Ÿ” Note: Some proxies also require authentication.


๐Ÿข 3. Using MID Server as an Alternative to Direct API Calls

If ServiceNow is hosted internally or API access is restricted externally, a MID Server is the best approach.

✅ What’s a MID Server?

A Management, Instrumentation, and Discovery (MID) Server is a lightweight Java process that sits inside your network and acts as a secure bridge between ServiceNow and internal systems.

✅ Use Cases:

  • When the target system is on-premise (ServiceNow can’t reach it)

  • When you don’t want to expose public API endpoints

  • When API calls need to run behind a proxy or firewall


๐Ÿ” MID Server & Scripted REST

You can create a Scripted REST API in a Scoped App that:

  • Accepts data pushed by PowerShell scripts

  • Processes the data inside ServiceNow (via MID Server, if needed)

Or use Orchestration + MID Server to:

  • Trigger PowerShell scripts via Workflow or Flow Designer

  • Pull results back into ServiceNow


🧪 Bonus Tips

  • Use API throttling best practices: no more than 100 calls/minute per user

  • Rotate OAuth tokens and secrets periodically

  • Use roles and ACLs to limit API access in ServiceNow

  • Log sensitive API interactions securely
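For token rotation, the standard OAuth2 refresh flow applies: exchange the refresh token at the same oauth_token.do endpoint instead of re-sending the password. The sketch below uses the field names from the OAuth2 refresh_token grant — verify them against your instance's OAuth configuration:

```
POST https://<instance>.service-now.com/oauth_token.do
Content-Type: application/x-www-form-urlencoded

grant_type=refresh_token
&client_id=<client_id>
&client_secret=<client_secret>
&refresh_token=<refresh_token_from_previous_response>
```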


🧭 Conclusion

Running PowerShell integrations with ServiceNow at scale requires more than just syntax — it takes planning for security, scalability, and reliability. By using secure credential storage, handling proxies correctly, and understanding MID Server architecture, you ensure your automation is enterprise-ready and compliant.


Optimizing Performance – Pagination, Filtering, and Query Design in PowerShell + ServiceNow API

 


🚀 Introduction

Once you’ve connected PowerShell to the ServiceNow Table API, the next big challenge is performance. Without the right approach, even a simple query can lead to:

  • Timeouts

  • Empty responses

  • Crashed scripts

  • Overloaded servers

This article covers 3 powerful techniques to optimize your integration:

  1. Pagination

  2. Field filtering

  3. Efficient sysparm_query usage


๐Ÿ” 1. Use Pagination (sysparm_limit and sysparm_offset)

By default, ServiceNow doesn’t return all records — and if you try to force it, your script will time out.

✅ Best Practice

```powershell
$limit = 100
$offset = 0
$headers = @{ "Authorization" = "Bearer $accessToken" }
$instance = "dev12345"
$url = "https://$instance.service-now.com/api/now/table/incident"

do {
    $pagedUrl = "$url?sysparm_limit=$limit&sysparm_offset=$offset"
    $response = Invoke-RestMethod -Uri $pagedUrl -Headers $headers
    $results = $response.result
    foreach ($record in $results) {
        Write-Output $record.number
    }
    $offset += $limit
} while ($results.Count -gt 0)
```

This loop pulls 100 records at a time — scalable, safe, and efficient.


🎯 2. Use sysparm_fields to Limit Response Size

By default, every API call returns all fields — even large ones like work_notes and other journal fields.

✅ Fix:

```powershell
$url = "https://$instance.service-now.com/api/now/table/incident?sysparm_fields=number,short_description,state"
```

This dramatically reduces payload size and speeds up execution.


🧠 3. Optimize Your sysparm_query Filter

Filtering is where most performance issues happen — especially when:

  • You use dot-walked fields

  • You use LIKE queries

  • You don’t filter by time or indexed fields

❌ Bad:


```
caller_id.nameLIKEjohn
```

Causes joins and slowdowns.

✅ Good:


```
caller_id=681ccaf9c0a8016401c5a33be04be441
```

✅ Add Time-Based Filters

Always use sys_updated_on or closed_at to narrow large tables:


```
sys_updated_on>javascript:gs.daysAgoStart(30)
```
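Because a filter like this contains reserved characters (`^`, `>`, `:`), it must be URL-encoded before it goes into the request URL. A quick illustration of what the encoder produces — shown in JavaScript purely to demonstrate the string transformation; PowerShell's `[System.Web.HttpUtility]::UrlEncode` serves the same purpose:

```javascript
// Encoded query combining an active filter with a 30-day window
const query = "active=true^sys_updated_on>javascript:gs.daysAgoStart(30)";

// Percent-encode reserved characters before appending as sysparm_query
const encoded = encodeURIComponent(query);
console.log(encoded);
// '^' becomes %5E, '>' becomes %3E, ':' becomes %3A
```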

🛠️ Bonus: Combine All Techniques

```powershell
Add-Type -AssemblyName System.Web  # needed for [System.Web.HttpUtility]

$limit = 100
$offset = 0
$query = "active=true^sys_updated_on>javascript:gs.daysAgoStart(30)"
$encodedQuery = [System.Web.HttpUtility]::UrlEncode($query)

do {
    $url = "https://$instance.service-now.com/api/now/table/incident?sysparm_query=$encodedQuery&sysparm_limit=$limit&sysparm_offset=$offset&sysparm_fields=number,short_description,state"
    $response = Invoke-RestMethod -Uri $url -Headers $headers
    $results = $response.result
    foreach ($incident in $results) {
        Write-Host "$($incident.number): $($incident.short_description)"
    }
    $offset += $limit
} while ($results.Count -gt 0)
```

✅ Summary Checklist

| Optimization                     | Benefit                                |
|----------------------------------|----------------------------------------|
| sysparm_limit + offset           | Prevents timeouts, enables large pulls |
| sysparm_fields                   | Reduces payload, faster API            |
| Use sys_id instead of names      | Avoids joins                           |
| Filter on sys_updated_on         | Narrows down queries                   |
| Avoid dot-walked or LIKE filters | Prevents performance bottlenecks       |

🧭 Conclusion

A well-optimized query can save hours in execution time and avoid failed automations. These techniques are essential for scaling your PowerShell + ServiceNow integration reliably.

In the next article, we’ll tackle real-world security and enterprise deployment tips, including proxies, secrets, and MID server considerations.


Handling Common API Errors and Timeouts When Connecting to ServiceNow

 

Error Handling and Timeouts

⚠️ Introduction

Sometimes your PowerShell script returns a 200 OK response from the ServiceNow API — but nothing works. No records, incomplete data, or even an error inside the payload. This happens more often than you'd expect, especially on large tables like incident, task, or cmdb_ci.

In this article, we'll break down:

  • Why ServiceNow returns errors inside successful HTTP responses

  • What causes API timeouts and transaction cancellations

  • How to detect, debug, and resolve them in your PowerShell integration


🧨 Problem 1: "Transaction Cancelled – Maximum Execution Time Exceeded"

This error happens when the server-side query takes too long to process. You’ll see:

```json
{
  "error": {
    "message": "Transaction cancelled: maximum execution time exceeded. Check logs for error trace or enable glide.rest.debug property to verify REST request processing."
  },
  "status": "200 OK"
}
```

Yes — it says 200. But it’s a failure.


๐Ÿ” Why This Happens

  • You're pulling too many records at once

  • Filters use unindexed or dot-walked fields

  • Long-running Business Rules or Flows are slowing the query

  • You forgot to paginate or narrow the date range


✅ Fix It With PowerShell

```powershell
# Correct use of pagination and fields
$limit = 100
$offset = 0
$url = "https://$instance.service-now.com/api/now/table/incident?sysparm_limit=$limit&sysparm_offset=$offset&sysparm_fields=number,short_description,state"
$response = Invoke-RestMethod -Uri $url -Headers $headers
$response.result
```

📌 Tip: Always use sysparm_limit, sysparm_offset, and sysparm_fields to reduce payload size.


❗ Problem 2: Dot-Walked Field Filters Kill Performance

Filters like this:


```
caller_id.department.name=Finance
```

…are slow and prone to failure because they introduce implicit SQL joins.


✅ Fix

Use direct sys_id values:

```powershell
$filter = "caller_id=6816f79cc0a8016401c5a33be04be441"
$url = "https://$instance.service-now.com/api/now/table/incident?sysparm_query=$filter"
```

❗ Problem 3: PowerShell Doesn’t Detect Embedded Errors

Even when ServiceNow returns a 200 OK, the real error is inside the JSON.

✅ Add a Safety Check:

```powershell
$response = Invoke-RestMethod -Uri $url -Method Get -Headers $headers
if ($response.error) {
    Write-Host "❌ API Error: $($response.error.message)"
} else {
    Write-Host "✅ Records returned: $($response.result.Count)"
}
```

⚙️ Extra Debugging Tools

  • Enable REST logs in ServiceNow
    Set property: glide.rest.debug = true (only temporarily)

  • Check syslog for REST errors
    Navigate to: System Logs → Errors

  • Use Postman to isolate whether the issue is with PowerShell or the API call itself


🧭 Conclusion

When PowerShell meets ServiceNow, performance and error handling are everything. Don't be fooled by a 200 status code — always check for hidden error payloads and optimize your queries.

In the next article, we’ll look at performance tuning techniques, including filtering best practices and sysparm_query tricks.


Getting Started – PowerShell + ServiceNow Table API with Authentication

 

PowerShell + ServiceNow Integration

🚀 Introduction

ServiceNow’s Table API provides full CRUD access to any record in the platform. Combine that with PowerShell, and you unlock the ability to automate ticketing, compliance tracking, CMDB updates, and more — right from the command line.

In this article, you’ll learn how to:

  • Authenticate using Basic Auth and OAuth2

  • Make your first Table API call from PowerShell

  • Parse and handle JSON responses

  • Set the stage for advanced integration in later posts


๐Ÿ” 1. Basic Authentication Setup

This is the simplest approach, but not recommended for production.

```powershell
# Replace with your instance and credentials
$instance = "dev12345"
$user = "admin"
$pass = "your_password"
$base64Auth = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes("$user`:$pass"))

# Define headers and URL
$headers = @{
    "Authorization" = "Basic $base64Auth"
    "Accept"        = "application/json"
}
$url = "https://$instance.service-now.com/api/now/table/incident?sysparm_limit=1"

# Call the API
$response = Invoke-RestMethod -Uri $url -Method Get -Headers $headers

# Output the result
$response.result
```

⚠️ Tip: Avoid hardcoding passwords. We’ll cover secure handling in Part 4.


🔑 2. OAuth2 (Recommended for Secure Use)

OAuth2 gives you token-based access, ideal for enterprise environments.

📌 Prerequisites:

  • OAuth enabled in ServiceNow

  • A registered app with Client ID and Client Secret

  • A user account with API access

```powershell
# Credentials and endpoint
$clientId = "your_client_id"
$clientSecret = "your_client_secret"
$username = "admin"
$password = "your_password"
$instance = "dev12345"
$tokenUrl = "https://$instance.service-now.com/oauth_token.do"

# Build request body
$body = @{
    grant_type    = "password"
    client_id     = $clientId
    client_secret = $clientSecret
    username      = $username
    password      = $password
}

# Get token
$response = Invoke-RestMethod -Uri $tokenUrl -Method Post -Body $body -ContentType "application/x-www-form-urlencoded"
$accessToken = $response.access_token

# Make Table API request
$headers = @{
    "Authorization" = "Bearer $accessToken"
    "Accept"        = "application/json"
}
$url = "https://$instance.service-now.com/api/now/table/incident?sysparm_limit=1"
$data = Invoke-RestMethod -Uri $url -Method Get -Headers $headers
$data.result
```

🔎 3. What You Should See

A single incident record, structured in JSON:

```json
{
  "result": [
    {
      "number": "INC0010001",
      "short_description": "Sample Incident",
      "state": "1",
      "sys_id": "abc123..."
    }
  ]
}
```

🧭 Conclusion

Connecting PowerShell to ServiceNow via the Table API is a powerful step toward automation. Whether you're managing incidents, risks, or CMDB items, understanding authentication methods is key.

In future articles, we’ll make this integration enterprise-grade with error handling, secure storage, and better performance.


InformativeTechnicalContent.com