Diogo Dias

Component for importing data via CSV file

Hello guys, I would like to know if anyone has run into the problem I'm facing. I'm new to Lightning/Apex, so I need help.

I'm creating a component that imports data via a CSV file. I convert the file to JSON and send it to Apex, which deserializes it into a list of a specific object, for example a list of Accounts or Tasks.
  • Lightning:
let jsonFile = JSON.stringify(csvFile);
  • Apex:
List<Account> listAccounts = (List<Account>) JSON.deserialize(jsonFile, List<Account>.class);

The method I created in Apex is annotated with "@future", because I don't need its return value to know whether the import succeeded; instead, I tell the user that he/she will receive an email with all the success or error information.
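For context, a minimal sketch of the @future method described above (the class and method names are hypothetical). Note that @future methods accept only primitive parameters and collections of primitives, which is one reason the JSON is passed as a String:

```apex
public with sharing class AccountImportService {

    // Hypothetical sketch: jsonFile is the stringified CSV data
    // sent from the Lightning component.
    @future
    public static void importAccounts(String jsonFile) {
        List<Account> listAccounts =
            (List<Account>) JSON.deserialize(jsonFile, List<Account>.class);

        // allOrNone = false: a partial insert, so one bad row does not
        // roll back the others. The SaveResults can feed the
        // success/error CSVs mentioned below.
        List<Database.SaveResult> results =
            Database.insert(listAccounts, false);
    }
}
```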

NOTE: The CSV file can contain many lines and columns, so the conversion to JSON can produce a very long String. The file must contain at least one record and at most 10,000 records to import.

These are my questions:
  • Instead of sending JSON in String format, do I have to send in Blob format? Is it possible?
  • Can I deserialize a Blob for a list of objects, for example List<Account> or List<Task>?
  • I send an email with two attachments: a success CSV (with the Ids generated on insert) and an error CSV. My fear is that both CSVs will get too big as Strings, because I know there is a maximum String size in Apex. Is there another way to attach the information other than via CSV?
NOTE: My solution works when the CSV has at most 325 records with 50 columns; the problem appears when I add more records, as if Apex had limits on converting and inserting the data. Some errors that have already occurred are the following:
  • Apex CPU time limit exceeded: the maximum CPU time on the Salesforce servers is 10,000 milliseconds (synchronous limit) or 60,000 milliseconds (asynchronous limit);
  • Apex heap size too large: Salesforce enforces an Apex heap size limit of 6 MB for synchronous transactions and 12 MB for asynchronous transactions;
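On the attachment question above: an email attachment body is a Blob, so the CSV can be attached with Messaging.EmailFileAttachment instead of being inlined in the message body. A minimal sketch (the recipient address and the csvSuccess variable are placeholders); note the heap limit still applies while the CSV String is being built:

```apex
// csvSuccess is assumed to hold the success-CSV contents as a String.
Messaging.SingleEmailMessage mail = new Messaging.SingleEmailMessage();
mail.setToAddresses(new List<String>{ 'user@example.com' });
mail.setSubject('Import results');
mail.setPlainTextBody('See the attached CSV files.');

Messaging.EmailFileAttachment attachment = new Messaging.EmailFileAttachment();
attachment.setFileName('success.csv');
// The attachment body is a Blob, not a String.
attachment.setBody(Blob.valueOf(csvSuccess));
mail.setFileAttachments(new List<Messaging.EmailFileAttachment>{ attachment });

Messaging.sendEmail(new List<Messaging.SingleEmailMessage>{ mail });
```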
Sitarama Murthy
Hi Diogo,

After the user uploads a CSV file containing more than 350 records, pass that file (as JSON or a Blob) to Batch Apex to handle the large data volume, instead of using an @future method.
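A minimal sketch of this suggestion, assuming the JSON String is still the input (the class name is hypothetical). The scope parameter of Database.executeBatch controls how many records each execute() call processes, which keeps each chunk within the CPU and heap limits:

```apex
public class AccountImportBatch implements Database.Batchable<sObject> {
    private List<Account> rows;

    public AccountImportBatch(String jsonFile) {
        // Deserialize once; the platform then hands execute()
        // one slice of this list at a time.
        this.rows = (List<Account>) JSON.deserialize(
            jsonFile, List<Account>.class);
    }

    public Iterable<sObject> start(Database.BatchableContext bc) {
        return rows;
    }

    public void execute(Database.BatchableContext bc, List<sObject> scope) {
        // allOrNone = false keeps the good rows even if some fail.
        Database.insert(scope, false);
    }

    public void finish(Database.BatchableContext bc) {
        // Send the success/error email from here, once all
        // batches have run.
    }
}

// Usage: process 200 records per execute() call.
// Database.executeBatch(new AccountImportBatch(jsonFile), 200);
```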

Thanks,
Ram
 
Diogo Dias
Hi Sitarama, how are you?

Please, could you show me an example? Have you ever done any activity similar to the one I'm doing?

Best regards,
Diogo