
    Top 50 Vlocity, OmniStudio & Salesforce Interview Questions with Answers

    • Writer: VlocityGru

    Currently 31 questions — updated regularly. Last updated: April 2026


    For Freshers and Experienced  |  2025-2026


    Whether you are a fresher stepping into the Salesforce ecosystem or an experienced developer preparing for your next big role, this guide covers the most commonly asked interview questions on Salesforce Core, Vlocity, OmniStudio, LWC, and more. All questions in this list are based on real interview experiences.

     

    Q1.  Why do we deactivate a user in Salesforce instead of deleting them?

    Answer:

    In Salesforce, you cannot delete a user once they have been created — and this is by design.

    The core reason is data ownership. Most records in Salesforce (Accounts, Opportunities, Cases, etc.) are owned by users. Deleting a user would orphan all those records, causing serious data integrity problems.

    Additionally, deleted users would break audit trails, sharing rules, approval processes, and workflow assignments that reference them.

    The correct approach is to Deactivate the user. A deactivated user:

    • Cannot log in to Salesforce

    • Does not consume a license

    • Still appears as the record owner, preserving all data relationships

    • Can be reactivated at any time if needed

    Pro Tip: Before deactivating, always reassign their open tasks, cases, and owned records to another active user.


    Q2.  Without using a Trigger or Flow, how can you prevent duplicate records based on First Name and Last Name?

    Answer:

    Approach 1 — Duplicate Management (No Code)

    Salesforce has a built-in Duplicate Management feature that handles this declaratively.

    Step 1 - Create a Matching Rule: Go to Setup > Matching Rules > New Rule. Select your object (e.g., Contact). Set matching criteria: First Name = Exact, Last Name = Exact. Activate the rule.

    Step 2 - Create a Duplicate Rule: Go to Setup > Duplicate Rules > New Rule. Link it to your Matching Rule. Set the action to Block (prevents saving entirely) or Allow with Alert (warns the user but allows saving).

    Once active, Salesforce automatically checks for duplicates on every create and update — zero code required.

    Approach 2 — Before Save Record-Triggered Flow

    If you need more custom logic (e.g., show a specific error message or check additional conditions), use a Before Save Record-Triggered Flow:

    • Trigger: Before Save, on Create and Update

    • Add a Get Records element to check if a record with the same First Name + Last Name already exists

    • If found, use a Custom Error element to block the save and display your message


    Q3.  If 200 Contact records are inserted and 10 of them have both Email and Phone empty, does the entire batch fail or only those 10 records?

    Answer:

    Only the 10 invalid records fail. The remaining 190 are inserted successfully.

    This is because addError() in Salesforce triggers is record-specific. Calling record.addError('message') marks only that individual record as failed without rolling back the rest of the transaction.

    Example logic inside the trigger:

    • Loop through Trigger.new

    • If record.Email == null AND record.Phone == null, call record.addError('Email or Phone is required')

    • Salesforce processes the rest of the batch normally

    Important distinction: If an unhandled exception is thrown (like a NullPointerException) instead of addError(), then the entire batch of 200 records would fail. Always use addError() for controlled, record-level validation.
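    A minimal sketch of the trigger logic described above (object and message text are illustrative):

```apex
trigger ContactValidation on Contact (before insert, before update) {
    for (Contact c : Trigger.new) {
        // addError() flags only this record as failed; the other
        // records in the same batch continue to save normally
        if (c.Email == null && c.Phone == null) {
            c.addError('Email or Phone is required');
        }
    }
}
```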


    Q4.  What is Shield Encryption in Salesforce and how does it work with Apex?

    Answer:

    Salesforce Shield Platform Encryption allows you to encrypt sensitive data at rest while allowing Salesforce to function normally.

    How to enable it:

    • Go to Setup > Encryption Policy > Encrypt Fields

    • Select the fields you want to encrypt (e.g., SSN, bank account, health data)

    • Salesforce uses a tenant-specific encryption key to protect the data

    How it works in Apex:

    • Apex can read and write encrypted fields without any special syntax

    • With Shield, any user who has field-level access to the field sees the decrypted value; visibility is controlled through field-level security and permission sets

    • The masked display (e.g., *******) and the 'View Encrypted Data' permission (Setup > Permission Sets > System Permissions) belong to Classic Encryption, the older encrypted-text field type, not to Shield Platform Encryption

    Key use case: Encrypting PII (Personally Identifiable Information) such as Social Security Numbers, financial data, or health records.


    Q5.  What causes the 'Too many SOQL queries: 101' exception in Salesforce?

    Answer:

    Salesforce enforces a governor limit of 100 SOQL queries per transaction. Exceeding this throws: System.LimitException: Too many SOQL queries: 101

    Common causes:

    • Running a SOQL query inside a for loop (the most frequent mistake)

    • Calling a DataRaptor Extract inside a Loop Block in an Integration Procedure

    • Calling multiple Apex helper methods that each internally execute SOQL queries

    • Relationship fields that were not part of the original query (e.g., record.Account.Name) return null in Apex rather than running a hidden query — so developers often re-query for them inside a loop, another common source of extra SOQL

    • Calling the same method multiple times in a batch context

    How to fix:

    • Always move SOQL queries outside of loops

    • Use Maps to store and look up query results inside loops

    • Review all helper methods for hidden SOQL usage

    Note: Even if no explicit SOQL is written in the IP, Vlocity DataRaptor Extracts count toward the SOQL governor limit — which is why IPs with complex DR chains can hit this error.
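    The Map pattern from the fix list can be sketched as follows (object names are illustrative):

```apex
// One SOQL query outside the loop instead of one query per record
Map<Id, Account> accountsById = new Map<Id, Account>(
    [SELECT Id, Name FROM Account WHERE Id IN :accountIds]
);
for (Contact c : contacts) {
    // In-memory lookup: consumes no SOQL inside the loop
    Account parent = accountsById.get(c.AccountId);
}
```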


    Q6.  What causes the 'Too many DML statements: 151' exception?

    Answer:

    Salesforce limits DML operations (insert, update, delete, upsert, undelete) to 150 per transaction.

    Common causes:

    • Performing DML inside a for loop

    • Chained triggers, flows, or process builders all firing DML in the same transaction

    • Calling helper methods that each perform their own DML

    Best practice: Collect all records into a List first, then perform a single DML operation on the entire list outside the loop. This is called Bulkification and is a core Apex requirement.
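    A short sketch of the bulkified pattern, assuming a hypothetical requirement to create one follow-up Task per Opportunity:

```apex
List<Task> tasksToInsert = new List<Task>();
for (Opportunity opp : opportunities) {
    // Collect records in memory; never perform DML inside the loop
    tasksToInsert.add(new Task(WhatId = opp.Id, Subject = 'Follow up'));
}
// One DML statement for the entire list
insert tasksToInsert;
```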


    Q7.  What is Mixed DML Exception and how do you resolve it?

    Answer:

    Mixed DML Exception occurs when you try to perform DML on both Setup objects and Non-Setup objects in the same transaction.

    Setup objects: User, PermissionSet, PermissionSetAssignment, Group, GroupMember, etc.

    Non-Setup objects: Account, Contact, Opportunity, custom objects, etc.

    Salesforce separates these because Setup changes affect the security and metadata layer, requiring special processing that cannot be mixed with standard DML.

    How to fix:

    • Run the Setup object DML in a separate transaction using an @future method

    • Or use System.runAs() in test classes to isolate the operations

    • Or split the logic into separate Queueable jobs
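    A minimal sketch of the @future approach (class and method names are illustrative):

```apex
public class UserProvisioningService {
    // Runs in its own transaction, so this Setup-object DML can no
    // longer collide with non-Setup DML from the calling context
    @future
    public static void assignPermissionSet(Id userId, Id permSetId) {
        insert new PermissionSetAssignment(
            AssigneeId = userId,
            PermissionSetId = permSetId
        );
    }
}
```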


    Q8.  What is the 'Uncommitted Work Pending' error and how do you prevent it?

    Answer:

    This error occurs when a DML operation (insert, update, delete) is performed before making an HTTP callout in the same transaction.

    Salesforce blocks this because if a DML runs first and then the callout fails, the transaction cannot be safely rolled back since the external system may have already received a partial call.

    Error message: 'You have uncommitted work pending. Please commit or rollback before calling out.'

    How to fix:

    • Always perform your HTTP callouts BEFORE any DML operations in the same transaction

    • In Integration Procedures, place DataMapper Load steps and other DML operations after the HTTP Action

    • Or separate the DML and callout into different transactions using @future(callout=true) or Queueable

    • SOQL queries before callouts are perfectly fine — only DML triggers this error
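    The correct ordering can be sketched as follows (endpoint and field values are illustrative):

```apex
public class OrderSync {
    public static void syncAccount(Account acct) {
        // 1. Callout FIRST: no DML has run yet in this transaction
        HttpRequest req = new HttpRequest();
        req.setEndpoint('https://example.com/api/orders');
        req.setMethod('POST');
        HttpResponse res = new Http().send(req);

        // 2. DML only AFTER the callout completes
        acct.Description = 'Synced with status ' + res.getStatusCode();
        update acct;
    }
}
```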


    Q9.  How does Role Hierarchy work in Salesforce and how do you configure it for Custom Objects?

    Answer:

    Role Hierarchy controls record visibility. Users higher in the hierarchy automatically gain read/edit access to records owned by users below them.

    For Standard Objects:

    • The 'Grant Access Using Hierarchies' checkbox is enabled by default

    • No additional setup needed — managers automatically see subordinates' records

    For Custom Objects:

    • This checkbox is NOT enabled by default

    • You must manually enable it: Setup > Object Manager > [Your Object] > Edit > Check 'Grant Access Using Hierarchies'

    Important: This is an object-level setting, not a record-level setting. Once enabled, it applies to all records of that object.

    Remember: Role Hierarchy provides visibility upward — it does not restrict access.


    Q10.  What is the difference between Custom Metadata Types and Custom Settings?

    Answer:

    Both are used to store configuration data, but they behave very differently:

    Custom Metadata Types:

    • Treated as metadata — fully deployable.

    • Records migrate with deployments, no need to recreate them in each org

    • Does NOT support Create, Update, or Delete (CUD) operations at runtime via DML

    • Does NOT require SeeAllData=true in test classes

    • Best for: configuration values, feature flags, field mappings

    Custom Settings:

    • Behaves like a custom object — supports DML at runtime

    • NOT deployable (data must be manually recreated in each org)

    • Requires SeeAllData=true in test classes to access existing data

    • Two types: List (org-wide) and Hierarchy (per org/profile/user)

    Rule of thumb: Use Custom Metadata for configuration you want to deploy. Use Custom Settings for runtime-editable configuration.
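    Custom Metadata records can even be read without consuming a SOQL query, using the built-in access methods (the type and record names below are hypothetical):

```apex
// Returns all records of the Custom Metadata Type, keyed by DeveloperName
Map<String, Feature_Flag__mdt> flags = Feature_Flag__mdt.getAll();

// Fetch a single record by DeveloperName
Feature_Flag__mdt betaFlag = Feature_Flag__mdt.getInstance('Enable_Beta');
```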


    Q11.  What are the two types of Custom Settings and how do they differ?

    Answer:

    1. List Custom Settings:

    • Data is the same for all users across the entire org

    • Accessed using: MySettings__c.getAll() or MySettings__c.getInstance('RecordName')

    • Use case: Org-wide configuration like API endpoints or feature toggles

    2. Hierarchy Custom Settings:

    • More intelligent — Salesforce checks three levels in order: User > Profile > Org

    • Returns the most specific value available for the current user

    • Accessed using: MySettings__c.getInstance() — automatically returns the right level

    • Use case: Settings that differ per user or profile, like page size preferences or regional configurations
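    The access patterns above can be sketched in Apex (the setting names are hypothetical):

```apex
// List Custom Setting: same values for every user in the org
Map<String, API_Config__c> allConfigs = API_Config__c.getAll();
API_Config__c defaultConfig = API_Config__c.getInstance('Default');

// Hierarchy Custom Setting: resolves User > Profile > Org automatically
Page_Prefs__c myPrefs = Page_Prefs__c.getInstance();
// Or resolve explicitly for a specific user
Page_Prefs__c userPrefs = Page_Prefs__c.getInstance(UserInfo.getUserId());
```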


    Q12.  What is the difference between @wire and Imperative Apex calls in LWC?

    Answer:

    @wire (Reactive):

    • Automatically re-executes when tracked properties or record data changes

    • Managed by the Salesforce wire service — data is fetched and cached automatically

    • Read-only data pattern

    • Best for: displaying record data that should refresh when the record is edited

    Imperative call (Controlled):

    • You decide exactly when the Apex method runs (button click, condition, form submit)

    • Does not auto-refresh — you call it manually

    • Supports full error handling and conditional logic

    • Best for: actions triggered by the user, form submissions, or complex decision-based logic

    Simple rule: Wire = automatic and reactive. Imperative = manual and controlled.
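    Both patterns side by side in one component (the Apex method getContacts is hypothetical):

```javascript
import { LightningElement, api, wire } from 'lwc';
import getContacts from '@salesforce/apex/ContactController.getContacts';

export default class ContactList extends LightningElement {
    @api accountId;
    rows;
    error;

    // Reactive: re-runs automatically whenever accountId changes
    @wire(getContacts, { accountId: '$accountId' })
    wiredContacts;

    // Imperative: runs only when the user clicks a button
    handleRefresh() {
        getContacts({ accountId: this.accountId })
            .then((data) => { this.rows = data; })
            .catch((error) => { this.error = error; });
    }
}
```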


    Q13.  What are the LWC Lifecycle Hooks and when does each one execute?

    Answer:

    constructor() — Runs when the component instance is first created. Use only to set default property values. Do NOT access the DOM here.

    connectedCallback() — Runs when the component is inserted into the DOM. Best place to call Apex, initialize data, subscribe to LMS channels or custom events.

    render() — Used for conditional template rendering (returning different HTML templates based on logic).

    renderedCallback() — Runs after every render and re-render. Use for DOM-dependent operations (input focus, third-party JS libraries). Always use a boolean flag to prevent infinite loops.

    disconnectedCallback() — Runs when the component is removed from the DOM. Use to unsubscribe from LMS, remove event listeners, and clear timers.

    errorCallback(error, stack) — Runs if a child component throws an unhandled error. Use to log errors or show a fallback UI.

    Note: If the page is refreshed or tab is closed and reopened, constructor and connectedCallback execute again from scratch. They do NOT re-run just because a reactive property changes.


    Q14.  What are the three decorators available in LWC and what does each one do?

    Answer:

    @api — Marks a property or method as public. Parent components can pass data into the child using @api properties and call @api methods from outside. @api properties are reactive.

    @track — Makes a property deeply reactive. In modern LWC (since the Spring '20 release), all fields are reactive for primitive reassignment by default. @track is needed only when you need reactivity on nested object or array mutations.

    @wire — Connects a property or function to a Salesforce data source such as an Apex method or wire adapters (getRecord, getPicklistValues, etc.). The component automatically re-renders when wired data changes.
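    A small component using all three decorators (component and property names are illustrative):

```javascript
import { LightningElement, api, track, wire } from 'lwc';
import { getRecord } from 'lightning/uiRecordApi';

export default class ChildCard extends LightningElement {
    // @api: public, settable by the parent, reactive
    @api recordId;

    // @track: needed only for reactivity on nested mutations
    @track settings = { theme: 'light' };

    // @wire: connects to a Salesforce data source
    @wire(getRecord, { recordId: '$recordId', fields: ['Account.Name'] })
    record;

    toggleTheme() {
        // Re-renders because settings is @track'ed
        this.settings.theme =
            this.settings.theme === 'light' ? 'dark' : 'light';
    }
}
```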


    Q15.  How do you pass data from a Grand Parent component to a Grand Child component in LWC?

    Answer:

    There are three common approaches:

    1. @api (Property Drilling): Grand Parent passes data to Parent via @api, and Parent passes it further down to Grand Child via @api. Simple but becomes messy with many levels.

    2. Pub/Sub Pattern: Grand Parent publishes an event on a named channel. Grand Child subscribes to the same channel. They communicate without a direct relationship. Works within the same page.

    3. Lightning Message Service (LMS) — Recommended: Uses a Message Channel defined as Salesforce metadata. Grand Parent publishes to the channel, Grand Child subscribes. Works across different areas of the page, even components not in the same DOM tree.

    Best practice: Use LMS for cross-component communication in production apps. Use @api only for direct parent-to-child data passing.
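    A sketch of the LMS subscriber side in the Grand Child (the message channel name and payload field are hypothetical):

```javascript
import { LightningElement, wire } from 'lwc';
import { subscribe, unsubscribe, MessageContext } from 'lightning/messageService';
import DATA_CHANNEL from '@salesforce/messageChannel/DataChannel__c';

export default class GrandChild extends LightningElement {
    subscription = null;
    payload;

    @wire(MessageContext)
    messageContext;

    connectedCallback() {
        // Receives whatever the Grand Parent publishes on the channel;
        // no direct DOM relationship between the components is required
        this.subscription = subscribe(this.messageContext, DATA_CHANNEL,
            (message) => { this.payload = message.recordData; });
    }

    disconnectedCallback() {
        unsubscribe(this.subscription);
        this.subscription = null;
    }
}
```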


    Q16.  What is the difference between an HTTP Action and a Remote Action in an Integration Procedure?

    Answer:

    Both are used for server-side calls from an Integration Procedure, but they serve different purposes:

    HTTP Action:

    • Used to make REST callouts to external systems (third-party APIs, middleware, external services)

    • Communicates over HTTP/HTTPS using standard request/response format

    • Best for: calling external APIs or integration endpoints

    Remote Action:

    • Used to invoke Apex classes directly within Salesforce

    • Supports SOQL queries, DML operations, and respects Apex governor limits

    • Handles structured Apex error formats and can contain complex business logic

    • Best for: data processing that requires DML, SOQL, or Apex governor limit awareness

    Can you embed an HTTP callout inside an Apex class called via Remote Action?

    Yes, and the result is the same. However, be careful: if the IP has already executed a DML step earlier in the same transaction, the Apex class making the callout will throw an Uncommitted Work Pending error.


    Q17.  In an Integration Procedure with the sequence DR Load > Apex > DR Load > HTTP Action, if the HTTP Action fails, does everything roll back?

    Answer:

    The sequence is: DR Load > Apex > DR Load > HTTP Action

    First, an important clarification: in OmniStudio, DR Load writes data to Salesforce (DML); it is NOT a read operation. DR Extract is the read operation.


    What happens step by step:

    Step 1 — DR Load runs → DML is performed. Data is written to Salesforce but sits in an uncommitted/pending state inside the transaction. It is NOT permanently saved yet.

    Step 2 — Apex runs → If Apex also performs any DML, it adds to the same pending transaction.

    Step 3 — Second DR Load runs → More DML added to the same pending transaction.

    Step 4 — HTTP Action tries to run → Salesforce immediately throws "Uncommitted Work Pending" error because there is pending DML in the same transaction before a callout. The HTTP Action never actually executes.

    Result → Because the transaction failed, Salesforce automatically rolls back ALL the DR Load data from both steps. Nothing is permanently saved in the database.


    Will everything rollback? YES

    Because all steps are in the same transaction — when the error is thrown, Salesforce rolls back the entire transaction including both DR Load operations.


    How to fix this sequence:

    Always place HTTP callouts BEFORE any DML operations. Correct sequence should be:

    DR Extract > HTTP Action > DR Load


    Key Rule to Remember:

    Same transaction + error = full automatic rollback. Separate transaction + error = no rollback possible.


    Q18.  A FlexCard is placed on a Record Page with two buttons that need to show/hide conditionally. How do you make this work generically for both Standard and Custom Objects?

    Answer:

    When a FlexCard is placed on a Lightning Record Page, Salesforce automatically injects two context variables:

    • recordId — the Id of the currently viewed record

    • objectApiName — the API name of the object (e.g., Account, My_Custom_Object__c)

    You can use these context variables directly in the FlexCard Conditional Rendering expressions for each button. Because objectApiName is provided dynamically by the platform, the same FlexCard works generically across any object without hardcoding.

    Example: Show Button A only when objectApiName == 'Account', or use a field value from recordId to drive a data condition.

    Important caveat: objectApiName is only available on Record Pages. If the FlexCard is placed on a Home Page or App Page, objectApiName will be null and your condition will not work — you would need to pass the object context manually.


    Q19.  You have an array [Apple, Banana, Kiwi, Mango] in an Integration Procedure. How do you separate values at even index positions from values at odd index positions?

    Answer:

    This requires index-based conditional logic, which a DataRaptor cannot handle — DataRaptors do not support position-based or index-based processing.

    The correct tool is a Loop Block inside the Integration Procedure:

    • Add a Loop Block and iterate over the array

    • Inside the loop, use a Conditional Block with the expression: %LoopIndex% % 2 == 0

    • If true (even index) — add the value to an 'EvenList' using a Set Values action

    • If false (odd index) — add the value to an 'OddList'

    After the loop, you have two separate lists. The loop index in OmniStudio is zero-based: index 0 = Apple (even), index 1 = Banana (odd), index 2 = Kiwi (even), index 3 = Mango (odd).
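    The same index-parity split expressed in plain JavaScript, just to make the Loop Block logic concrete:

```javascript
const fruits = ['Apple', 'Banana', 'Kiwi', 'Mango'];
const evenList = [];
const oddList = [];

fruits.forEach((value, index) => {
    // Zero-based index: positions 0 and 2 are even, 1 and 3 are odd
    if (index % 2 === 0) {
        evenList.push(value);
    } else {
        oddList.push(value);
    }
});

console.log(evenList); // ['Apple', 'Kiwi']
console.log(oddList);  // ['Banana', 'Mango']
```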


    Q20.  Why should we use Named Credentials instead of hardcoding endpoint URLs and credentials directly in Apex or Integration Procedures?

    Answer:

    Named Credentials are the recommended and secure way to manage external connections. Here is why:

    Security: Credentials (passwords, tokens, OAuth secrets) are stored securely by Salesforce and are never visible in code or debug logs.

    No hardcoding: The endpoint URL lives in a Named Credential record. If the URL changes, you update it in one place in Setup and all code using it automatically picks up the change without any deployment.

    Automatic authentication: Named Credentials support OAuth 2.0, Basic Auth, JWT, and more. Salesforce handles token generation and refresh automatically — you do not need to manage it manually.

    Environment flexibility: You can configure different Named Credentials for Sandbox vs Production, pointing to different endpoints. No code changes needed when moving between environments.

    Usage in Apex: HttpRequest.setEndpoint('callout:MyNamedCredential/path')
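    Expanded slightly (the Named Credential name and path are illustrative):

```apex
HttpRequest req = new HttpRequest();
// Salesforce substitutes the stored endpoint URL for the
// 'callout:' prefix and injects authentication automatically
req.setEndpoint('callout:MyNamedCredential/v1/customers');
req.setMethod('GET');
HttpResponse res = new Http().send(req);
System.debug(res.getStatusCode());
```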


    Q21.  What is the difference between Named Credentials and External Credentials in Salesforce?

    Answer:

    These two concepts work together and were separated in newer Salesforce releases for better security control:

    External Credentials:

    • Define the authentication mechanism (OAuth 2.0, Basic Auth, Custom Header, etc.)

    • Linked to a Permission Set to control which users and components can use them

    • Think of it as: HOW to authenticate

    Named Credentials:

    • Define the endpoint URL of the external system

    • Reference an External Credential to know which authentication to use

    • Think of it as: WHERE to call

    In simple terms: Named Credential = URL. External Credential = the authentication details. Together they form a complete, secure external connection.


    Q22.  How can you make an Integration Procedure run asynchronously when it is called from inside an OmniScript?

    Answer:

    By default, an IP called from OmniScript runs synchronously (the user waits for it to complete). To make it asynchronous, you can use the following patterns:

    @future: Runs in the background with no response returned to the caller. Best for fire-and-forget operations.

    Continuation: Designed for long-running HTTP callouts. The server processes it asynchronously, but the user interface waits for the result. Prevents HTTP timeout errors for slow external APIs.

    Chainable IP: Executes steps in a chained sequence. Synchronous in behavior but avoids governor limit issues by distributing work across chained calls.

    Queueable: Runs the IP as a background Queueable job. Returns a Job ID immediately. Best for background processing that does not need to block the user.

    Queueable Chainable: Allows multiple IPs to execute sequentially as chained async jobs — ideal for large data processing pipelines.

    Choose based on need: Continuation if the user must wait for the result. Queueable if processing can happen fully in the background.


    Q23.  What is OmniOut in OmniStudio?

    Answer:

    OmniOut is an OmniStudio feature that allows you to deploy and run OmniScripts and FlexCards outside of Salesforce — on third-party websites, portals, or any external web application.

    With OmniOut you can embed OmniStudio components into:

    • External websites built on any technology stack

    • Partner or customer portals

    • Mobile applications

    • Any web app outside the Salesforce platform

    It works by generating an embeddable JavaScript snippet that loads the OmniScript or FlexCard from Salesforce into the external page.

    Use case: A company wants to use an OmniScript-based guided insurance quote process on their public website without giving visitors access to Salesforce.


    Q24.  In OmniScript, you are displaying 20 records. The 10th record has an error. How do you show the error for that one record while displaying the rest normally?

    Answer:

    The recommended approach is combining error-aware JSON responses with a Repeat Block in OmniScript.

    Step 1 - Handle errors in the IP or Apex:

    Process all 20 records. For any record with an issue, do not throw an exception — instead, include an error flag in the JSON for that record, for example: { hasError: true, errorMessage: 'Invalid date format' }

    Step 2 - Return the full list:

    Return all 20 records as a JSON array including the error fields alongside the normal data fields.

    Step 3 - Use a Repeat Block in OmniScript:

    Bind the response array to a Repeat Block that iterates over all 20 records.

    Step 4 - Conditional rendering inside the Repeat Block:

    Show an error message element when hasError == true. Show the normal data display when hasError == false.

    This gives users full visibility of all 20 records and clearly highlights which one has an error — without failing the entire operation.


    Q25.  What is the difference between Screen Flow and OmniScript?

    Answer:

    Both are guided process tools, but they differ significantly in power, performance, and licensing:

    Screen Flow (Salesforce Core):

    • Native Salesforce tool — no extra license required

    • Basic UI using standard flow screen components

    • Uses Flow elements: Get Records, Decision, Apex Action, Update Records

    • Each screen transition involves a server round trip, which can feel slower for complex processes

    • Best for: simple to moderate admin-built processes

    OmniScript (OmniStudio):

    • Part of the OmniStudio package — requires license (included in Industry Clouds)

    • Advanced guided UI: step indicators, rich layouts, dynamic branching, custom components

    • Uses DataRaptors, Integration Procedures, and Apex for data operations

    • Offloads processing to the server side, making complex flows significantly faster

    • Supports multi-channel deployment: Salesforce, Experience Cloud, and OmniOut (external sites)

    • Best for: enterprise-grade, high-performance guided processes with deep system integrations

    In short: Screen Flow is free and sufficient for simple use cases. OmniScript is the choice for complex, high-performance, multi-channel enterprise flows.


    Q26.  What are the common runtime errors in OmniStudio and what causes each one?

    Answer:

    Null Pointer Exception / Missing JSON Node:

    Caused by accessing a JSON path that does not exist or is null in the response. Fix: Always validate and handle null values before accessing nested nodes in your IP or Apex.

    Uncommitted Work Pending:

    DML was performed before an HTTP callout in the same transaction. Fix: Always place callouts before DML, or separate them into different transactions.

    Timeout Error:

    Processing exceeded the allowed execution time — caused by large payloads, slow external APIs, or hitting Apex execution limits. Fix: Use Continuation for long callouts, Queueable for large data, and paginate large datasets.

    101 SOQL / 151 DML Exceptions:

    Salesforce governor limits exceeded. Fix: Bulkify the code — move queries and DML outside of loops.

    Mixed DML Exception:

    Setup and non-Setup objects updated in the same transaction. Fix: Separate into different transactions using @future or Queueable.


    Q27.  What is the difference between Business Accounts and Person Accounts in Salesforce?

    Answer:

    Business Accounts (B2B):

    • Represents a company or organization

    • Contacts are created as separate records and linked to the Account

    • Stores company-level data: company name, industry, revenue, etc.

    • Available in all Salesforce orgs by default

    Person Accounts (B2C):

    • Represents an individual consumer — merges Account and Contact into one record

    • Behind the scenes, Salesforce auto-creates a hidden Contact linked to the Account

    • Shows both Account and Contact fields on the same record page

    • No separate Contact record is visible

    • Must be explicitly enabled by Salesforce Support or from Setup

    • Once enabled, it CANNOT be disabled

    Common use cases for Person Accounts: banks, insurance companies, healthcare providers, and any B2C business managing individual customers.


    Q28.  Why is data migration done using Bulk API or ETL tools like MuleSoft or Informatica instead of standard Salesforce tools?

    Answer:

    Standard import tools like Data Import Wizard or Data Loader have performance and scalability limitations when handling millions of records.

    Bulk API advantages:

    • Specifically designed for large-scale data operations — handles millions of records

    • Processes records asynchronously in batches — no UI timeout risk

    • Supports all DML operations: insert, update, upsert, delete, hard delete

    ETL tools (MuleSoft, Informatica, Talend) advantages:

    • Act as middleware — read data from source, transform/map it, and load it into Salesforce

    • Handle complex data transformations, deduplication, and field mapping

    • Built-in error logging, retry logic, and rollback capabilities

    • Connect multiple source and destination systems in the same pipeline

    • Enterprise-grade scheduling, monitoring, and audit trails

    When to use which: For direct high-volume loads into Salesforce, use Bulk API. For migrations involving complex transformations across multiple systems, use an ETL tool.


    Q29.  What is a DataMapper in OmniStudio, what are its types, and what are its limitations?

    Answer:

    A DataMapper (formerly called DataRaptor) is an OmniStudio tool used to read from and write to Salesforce objects declaratively, without writing SOQL or DML code.

    Types of DataMappers:

    • DataMapper Extract — reads data from Salesforce (like SOQL SELECT)

    • DataMapper Load — writes data to Salesforce (like DML insert/update/upsert)

    • DataMapper Transform — transforms JSON from one structure to another without any database interaction

    • DataMapper Turbo Extract — a faster, simplified Extract for single-object queries with less configuration

    Limitations:

    • Does NOT support index-based logic — you cannot check 'is this the 5th record in the list'

    • Cannot execute complex conditional branching — use IP Loop and Conditional blocks for that

    • Not suitable for processing logic that depends on position in an array

    • Limited error handling compared to Apex

    Use DataMappers for straightforward data reads and writes. Use Integration Procedures with Loop and Conditional blocks when you need complex logic.


    Q30.  What causes a Timeout Error in Integration Procedures and how do you resolve it?

    Answer:

    A Timeout Error occurs when an Integration Procedure or its called components take longer than the allowed execution time.

    Common causes:

    • Large payload — processing hundreds or thousands of records in a single synchronous IP call

    • Slow external HTTP endpoint — the third-party API takes too long to respond

    • Heavy Apex logic or unoptimized SOQL inside a Remote Action

    • Hitting Salesforce synchronous execution limits (10 seconds for synchronous Apex)

    Solutions:

    • Use Continuation for long-running HTTP callouts to offload the wait and prevent UI timeouts

    • Move heavy data processing to Queueable or Batch Apex

    • Paginate large datasets — process in chunks instead of all at once

    • Optimize SOQL queries and reduce payload size

    • Use asynchronous IP patterns (Queueable, Future) when the user does not need to wait for the result


    Q31. What are the Lifecycle Phases of an LWC component?

    Answer:

    LWC components go through 5 lifecycle phases in this order:

    1. Component is being created — constructor() fires. Component instance is initialized in memory.

    2. Component is inserted into the DOM — connectedCallback() fires. Component is now part of the page.

    3. Component is rendered / re-rendered — render() and renderedCallback() fire. Happens on first load and every time reactive properties change.

    4. Component is removed from the DOM — disconnectedCallback() fires. Component is taken off the page.

    5. Error thrown in a child component — errorCallback() fires. Parent component catches the child's error.

    Difference between Lifecycle Hooks and Lifecycle Phases?

    • Phases = the stages a component goes through (the WHAT)

    • Hooks = the JavaScript functions you write to react to those phases (the HOW)


    More Questions Coming Soon!

    Bookmark this page — it is updated regularly with new questions based on real interview experiences.

    Visit vlocitygru.com for more Vlocity, OmniStudio & Salesforce guides.


    © 2024 VlocityGru Blog. All Rights Reserved.
