
Schema User Import Job Failed with java.lang.OutOfMemoryError

LokeshSoundar
New Contributor

Hi Team,

 

We have recently started our first integration in the Saviynt Dev tenant for our client and are testing user import via the Schema user import job. We added just one user to the CSV file and placed the SAV file, which contains some transformations, into the file directory. After running the user import job, we get the error below.

","","","2023-08-10T08:54:04.169149278Z stdout F 2023-08-10 08:54:04,168 [quartzScheduler_Worker-3] ERROR listeners.ExceptionPrinterJobListener - Exception occured in job: Schema.SchemaUserJob"
"2023-08-10T08:54:04.833+00:00","ecm-worker","","","","2023-08-10T08:54:04.169188078Z stdout F org.quartz.JobExecutionException: Java heap space [See nested exception: java.lang.OutOfMemoryError: Java heap space]"
"2023-08-10T08:54:04.833+00:00","ecm-worker","","","","2023-08-10T08:54:04.169192778Z stdout F at org.quartz.core.JobRunShell.run(JobRunShell.java:199)"
"2023-08-10T08:54:04.833+00:00","ecm-worker","","","","2023-08-10T08:54:04.169195778Z stdout F at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:546)"
"2023-08-10T08:54:04.833+00:00","ecm-worker","","","","2023-08-10T08:54:04.169197878Z stdout F Caused by: java.lang.OutOfMemoryError: Java heap space"
"2023-08-10T08:54:04.833+00:00","ecm-worker","","","","2023-08-10T08:54:04.169200578Z stdout F at java.util.Arrays.copyOf(Arrays.java:3181)"
"2023-08-10T08:54:04.833+00:00","ecm-worker","","","","2023-08-10T08:54:04.169202879Z stdout F at java.util.ArrayList.grow(ArrayList.java:267)"
"2023-08-10T08:54:04.833+00:00","ecm-worker","","","","2023-08-10T08:54:04.169205679Z stdout F at java.util.ArrayList.ensureExplicitCapacity(ArrayList.java:241)"
"2023-08-10T08:54:04.833+00:00","ecm-worker","","","","2023-08-10T08:54:04.169207879Z stdout F at java.util.ArrayList.ensureCapacityInternal(ArrayList.java:233)"
"2023-08-10T08:54:04.833+00:00","ecm-worker","","","","2023-08-10T08:54:04.169210079Z stdout F at java.util.ArrayList.add(ArrayList.java:464)"
"2023-08-10T08:54:04.833+00:00","ecm-worker","","","","2023-08-10T08:54:04.169212279Z stdout F at grails.converters.JSON.parse(JSON.java:283)"
"2023-08-10T08:54:04.833+00:00","ecm-worker","","","","2023-08-10T08:54:04.169214479Z stdout F at com.saviynt.ecm.services.ImportSAvDataUserService.doImportDataPreprocessing(ImportSAvDataUserService.groovy:168)"
"2023-08-10T08:54:04.833+00:00","ecm-worker","","","","2023-08-10T08:54:04.169216879Z stdout F at com.saviynt.ecm.services.ImportSAvDataUserService.importDataFromFile(ImportSAvDataUserService.groovy:728)"
"2023-08-10T08:54:04.833+00:00","ecm-worker","","","","2023-08-10T08:54:04.169219179Z stdout F at com.saviynt.ecm.services.ImportSAvDataUserService.importDataFromFile(ImportSAvDataUserService.groovy:682)"
"2023-08-10T08:54:04.833+00:00","ecm-worker","","","","2023-08-10T08:54:04.169222379Z stdout F at SchemaUserJob$_execute_closure2_closure8.doCall(SchemaUserJob.groovy:213)"

 

Below is our CSV file:

Personal Reference,Full Name,Position,Reporting Unit,Department,Function,Category:Posn,Post,Payroll,Reporting Manager email,Reporting Manager Staff ID,User Email,Joining Date:People,Leaving Date:People,Current Absence Start Date,Current Absence End Date,Current Absence Group,Current Absence Type,Personal Email,Mobile Number,Forename1:People,Surname:People,Site,CostCode:Posn
100018,xxxxx xxxx,Team Manager,Customer Voice Dept | Swansea @Client Group House,Customer Loyalty Voice,Customer Loyalty,Employee,Team Manager,xxx Limited,xxxxxxxxxxxx@xxxxxx.co.uk,100017,xxxx.xxxxx@xxxxx-test.co.uk,12/08/2023,,,,,,xxxxxx.xxxxxx@abc.com,7777777777,xxxxx,xxxxx,London,2SGWQU

The SAV file is attached.

We can see from the logs that both the SAV and users files were picked up by the job. Kindly help us resolve this issue.

Thanks,

Lokesh

[This message has been edited by moderator to mask PII. Note: Please make sure to mask any PII info before posting in public forums. Thanks]


Sivagami
Valued Contributor

Hey,

Please share a sample of the SAV file used. Make sure any confidential information is replaced with dummy values before sharing.

-Siva

LokeshSoundar
New Contributor

Hi Siva,

PFA the file.

Thanks,

Lokesh

Hi @LokeshSoundar ,

Make sure the number of columns and their order/sequence are the same in your CSV and SAV files.
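For anyone who wants to verify this quickly, here is a minimal sketch in Java that compares the CSV header row against an expected column order. The file name users.csv and the expected column list are placeholders; copy the actual column order from your own SAV mapping, since this sketch does not parse the Saviynt SAV format itself.

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.Arrays;
import java.util.List;

public class HeaderOrderCheck {
    public static void main(String[] args) throws IOException {
        // Hypothetical expected order -- copy this from your SAV mapping.
        List<String> expected = Arrays.asList(
                "Personal Reference", "Full Name", "Position"); // ...add the rest

        try (BufferedReader reader = new BufferedReader(new FileReader("users.csv"))) {
            String headerLine = reader.readLine();
            if (headerLine == null) {
                System.out.println("CSV file is empty");
                return;
            }
            String[] actual = headerLine.split(",", -1);

            if (actual.length != expected.size()) {
                System.out.printf("Column count mismatch: CSV has %d, expected %d%n",
                        actual.length, expected.size());
            }
            for (int i = 0; i < Math.min(actual.length, expected.size()); i++) {
                if (!actual[i].trim().equals(expected.get(i))) {
                    System.out.printf("Column %d differs: CSV '%s' vs expected '%s'%n",
                            i + 1, actual[i].trim(), expected.get(i));
                }
            }
        }
    }
}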


Pandharinath Mahalle(Paddy)
If this reply answered your question, please Accept As Solution to help others who may have the same problem. Give Kudos 🙂

Hi,

Yes, we have validated that, and it looks fine. I also restarted the instance and tried again, since this is a memory issue, but it still did not fix the problem.

Thanks,

Lokesh

dgandhi
All-Star

The error is related to memory. Can you restart the instance, make sure enough memory is available on the server, and then retry?

Thanks,
Devang Gandhi
If this reply answered your question, please Accept As Solution and give Kudos to help others who may have a similar problem.

Hi Devang,

 

We have restarted the instance and tried again, but we still see the same issue. We are trying to import only one user, and this is a new instance and a new implementation. It is the first integration we are setting up, and no other jobs are running.

 

Thanks

Lokesh

rushikeshvartak
All-Star

Can you check that the column case is the same in the input file and the SAV file?

EMPLOYEEID is not found in your input file. 
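To illustrate the point about case, here is a small sketch (the file name and the attribute checked are placeholders for this example) showing how an exact, case-sensitive header lookup can fail while a case-insensitive one succeeds; if the two results differ for a mapped column, only the letter case needs to be aligned between the CSV and SAV files.

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class ColumnCaseCheck {
    public static void main(String[] args) throws IOException {
        String wanted = "EMPLOYEEID"; // example attribute name only
        try (BufferedReader reader = new BufferedReader(new FileReader("users.csv"))) {
            String headerLine = reader.readLine();
            if (headerLine == null) return;
            boolean exact = false;
            boolean ignoringCase = false;
            for (String h : headerLine.split(",", -1)) {
                if (h.trim().equals(wanted)) exact = true;
                if (h.trim().equalsIgnoreCase(wanted)) ignoringCase = true;
            }
            // If these two results differ, only the column name casing is off.
            System.out.println("Exact match: " + exact);
            System.out.println("Match ignoring case: " + ignoringCase);
        }
    }
}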


Regards,
Rushikesh Vartak
If you find the response useful, kindly consider selecting Accept As Solution and clicking on the kudos button.

Hi Rushikesh,

I have checked my CSV and SAV files, and I can see the employee ID in the first column: Personal Reference in my CSV file is mapped to EmployeeID in the SAV file.

Thanks,

Lokesh

LokeshSoundar
New Contributor

Hi Team,

Please let me know if there is any update on this.

Since I have tried restarting the instance and the data mapping seems fine, may I know what the next step is here?

This is related to a Java memory issue, so please help resolve it from the Saviynt end.

Thanks,

Lokesh

Remove the colon (:) from column names such as "Date:People".
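If it helps, here is a minimal one-off cleanup sketch, assuming the input file is named users.csv and a cleaned copy named users_clean.csv is acceptable; it strips colons from the header row only, and the renamed columns would also need to be reflected in the SAV file so the two stay in sync.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;

public class StripColonsFromHeader {
    public static void main(String[] args) throws IOException {
        List<String> lines = Files.readAllLines(Paths.get("users.csv"));
        if (!lines.isEmpty()) {
            // Only the header row is touched, e.g. "Joining Date:People" -> "Joining DatePeople".
            lines.set(0, lines.get(0).replace(":", ""));
        }
        Files.write(Paths.get("users_clean.csv"), lines);
    }
}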


Regards,
Rushikesh Vartak
If you find the response useful, kindly consider selecting Accept As Solution and clicking on the kudos button.