Sri549

Apex: "Regex too complicated" issue while uploading a file

Hello,

When I am uploading around 15,000 records using Apex, I am facing the error "Regex too complicated" on this line, where I split the complete file into rows by newline before doing the next steps:

filelines = nameFile.split('\n');

When I upload fewer records, around 1,000, it is successful.
Could anyone please help me out or suggest a way to get rid of this error?

Regards
Srinivas


 
Best Answer chosen by Sri549
Frédéric Trébuchet
Hi,

Follow this link https://developer.salesforce.com/forums/ForumsMain?id=906F00000008xTLIAY.
Hope this helps.

Regards,
Fred
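For anyone hitting this later: Apex's String.split() is regex-based, which is why very large inputs can throw "Regex too complicated". One common workaround is to scan for the newline delimiter manually instead of using split(). A minimal sketch, assuming nameFile holds the full file body as a String:

```apex
// Sketch: split a large file body on newlines without the regex
// engine that String.split() uses under the hood.
List<String> filelines = new List<String>();
Integer start = 0;
Integer pos = nameFile.indexOf('\n');
while (pos != -1) {
    filelines.add(nameFile.substring(start, pos));
    start = pos + 1;
    pos = nameFile.indexOf('\n', start);
}
// Keep the last line if the file doesn't end with a newline
if (start < nameFile.length()) {
    filelines.add(nameFile.substring(start));
}
```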

All Answers

Frédéric Trébuchet
Hi,

Did this answer help you solve your problem?
If so, please mark the question as Solved and kindly select the best answer.

Thanks,
Fred
Sri549
Hi,

Thanks a lot for helping me and suggesting a way.
But when I upload 5,000 or more records, I am facing a view state error, even after marking the string variable transient:

public transient Blob contentFile;

Could you please suggest a way to get out of this error?
Thanks a lot again.

Regards
Srinivas

 
Frédéric Trébuchet
Thanks for the tag.
It seems the view exceeds the view state size limit (135KB).
Have a look at https://developer.salesforce.com/forums/ForumsMain?id=906F000000099GbIAI.
Let me know if it was a good suggestion, for my learning trail.

Regards,
Fred
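As a side note, the keyword is transient (not "transcient") and it precedes the type. A minimal controller sketch with illustrative names, showing which members are excluded from the view state:

```apex
public with sharing class FileUploadController {
    // Transient members are not serialized into the view state
    // between requests, so they don't count toward the size limit
    public transient Blob contentFile { get; set; }
    public transient String nameFile  { get; set; }

    // Anything kept across postbacks still counts toward the limit,
    // so keep persistent state small (e.g. just a record count)
    public Integer insertedCount { get; set; }
}
```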
Sri549
Hello Frédéric,

Thanks a lot for your kind support.
I am actually uploading 15,000 records using Batch Apex.
When I upload a 5,000-record sheet, it inserts only 4,975 records.
When I upload a 10,000-record sheet, it inserts only 9,950 records.
When I upload a 15,000-record sheet, it inserts only 14,875 records.
I get the same result every time.
The debug logs show something like "199 records have been inserted/updated", because the batch size is 200.
There is no issue with the data, and there is no exception either, but a single record is being skipped in each batch.
Could you please help me figure out this issue?
Thanks in advance.

Regards
Srinivas
Frédéric Trébuchet
Hi,
25 records missed per batch of 5,000 means, in other words, 199 treated per batch of 200...
Are you sure the list that contains your records is 200 lines long?
Maybe there is a mistake in your counter initialization or comparison.
Often, the bigger the errors, the harder they are to discover.

Otherwise, if you are sure of your code, I suggest you open a new question for this specific case.
Let me know.

Regards,
Fred
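To illustrate the kind of boundary bug Fred is pointing at, here is a self-contained sketch (illustrative counters, not the actual upload code) of how a loop can process exactly 199 of 200 records:

```apex
// Suppose each batch hands you a 200-element list
List<Integer> records = new List<Integer>();
for (Integer i = 0; i < 200; i++) { records.add(i); }

// Bug: starting at 1 silently drops records[0]
Integer processedFrom1 = 0;
for (Integer i = 1; i < records.size(); i++) { processedFrom1++; }

// Bug: '< size() - 1' drops the last record instead
Integer processedShort = 0;
for (Integer i = 0; i < records.size() - 1; i++) { processedShort++; }

// Correct: covers indices 0 .. size() - 1
Integer processedAll = 0;
for (Integer i = 0; i < records.size(); i++) { processedAll++; }

System.assertEquals(199, processedFrom1);
System.assertEquals(199, processedShort);
System.assertEquals(200, processedAll);
```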
Sri549
Hello,

Thanks for your help, this has been solved.
Could you please tell me the maximum amount of data we can upload?
I mean, I would like to upload 75,000 records at the same time. Is that feasible?


Thanks
Srinivas
 
Sri549
Hello,

When I upload around 45,000 records, I am getting this error:

System.LimitException: Batchable instance is too big

Could you please suggest a way to properly insert all the records?


Regards
Srinivas
Sri549
Hello Frédéric,

Hope you are doing well; sorry to disturb you again. I am working on the file upload functionality with 15,000 records.
When I process them using Batch Apex, a single batch gets "Apex CPU time limit exceeded", while none of the others disturb the flow.
I am just doing a string operation within a column: if there are multiple values such as "test,test", I replace them with a space.

Regards
Srinivas
 
 
Frédéric Trébuchet
Hi Srinivas,

String operations on large strings can be CPU-consuming. If you do many string concatenations and/or use many regular expressions to transform your content, that can be the cause of your problem.
I suggest you first try with a small file in which you know the transformation will occur, to validate your algorithm.
If it works as expected, try splitting your "big file" into several smaller files (for example, 5,000 records each).
Another way could be to perform the transformation before the file goes into your batch (i.e., outside of Salesforce).

Hope this helps,
Fred
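One concrete note on the string operation itself: in Apex, String.replaceAll() runs a regular expression on every call, while String.replace() is a literal substitution. For a fixed-text replacement like turning "test,test" into a space, the literal form avoids the regex engine entirely. A sketch with an illustrative cell value:

```apex
// 'cell' stands in for one column value from the uploaded file
String cell = 'abc test,test def';

// replaceAll() compiles and evaluates a regex on every call,
// which adds up across 15,000 rows
String viaRegex = cell.replaceAll('test,test', ' ');

// replace() is a plain literal substitution, no regex engine
String viaLiteral = cell.replace('test,test', ' ');

System.assertEquals('abc   def', viaLiteral);
System.assertEquals(viaRegex, viaLiteral);
```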