
I'm looking for the best way to sync products (items), invoices, sales orders, and customers on a daily basis. I've had other portals that sync almost 1M records via a nightly CSV and run flawlessly.

NetSuite offers APIs, but they all seem to be riddled with limits, and you're reliant on RESTful calls, which I've found are great for a few rows or for incremental updates, but not for pulling 100K to 1M rows.

I'm leaning towards what has worked for me with large data sets, namely:

  • Database connector into NetSuite (eg ODBC)
  • CSV Export (somehow... I've heard you can use saved searches, eg create an "all customers" search, then schedule that search to run every night and export the results to a CSV; see the sketch after this list)
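
From what I've read, the scheduled export could be a SuiteScript 2.0 scheduled script using the N/task SearchTask, which runs the search asynchronously server-side and writes the results to a CSV in the File Cabinet. A minimal sketch (the search id customsearch_all_customers and the path ExportFolder/customers.csv are hypothetical):

```javascript
/**
 * @NApiVersion 2.x
 * @NScriptType ScheduledScript
 */
define(['N/task', 'N/log'], function (task, log) {
    function execute(context) {
        // Queue an asynchronous server-side export of the saved search.
        // The task writes its full result set to a CSV in the File Cabinet.
        var searchTask = task.create({ taskType: task.TaskType.SEARCH });
        searchTask.savedSearchId = 'customsearch_all_customers'; // hypothetical search id
        searchTask.filePath = 'ExportFolder/customers.csv';      // hypothetical folder/file
        var taskId = searchTask.submit();
        log.audit('CSV export queued', 'SearchTask id: ' + taskId);
    }
    return { execute: execute };
});
```

Because the export runs as a background task rather than a synchronous call, it should sidestep the request timeouts I'm worried about.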

Any help appreciated. All the NetSuite APIs look good on paper, but I just don't see how they can reliably handle such large daily data sets without hitting limits, timeouts, or the like.


EDIT 1: I have read that some people pay for the (expensive) SuiteAnalytics Connect add-on module and get ODBC access to their data.

They also say you could probably do this through RESTlets or SuiteTalk, but I don't believe those would be reliable at this data volume.

EDIT 2: Schedule a saved search that sends the CSV as an email attachment, or have it place the CSV in a File Cabinet folder; then, from an external server, call web services or a RESTlet to grab the new CSV files from that folder. A sketch of that RESTlet is below.
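
For the fetch side, a minimal RESTlet sketch, assuming the nightly export has already dropped a CSV into the File Cabinet (the path query parameter and file location are hypothetical):

```javascript
/**
 * @NApiVersion 2.x
 * @NScriptType Restlet
 */
define(['N/file'], function (file) {
    // GET with a query parameter like ?path=ExportFolder/customers.csv (hypothetical)
    function get(params) {
        // file.load accepts an internal id or an absolute File Cabinet path.
        var csv = file.load({ id: params.path });
        // getContents() returns the file text; NetSuite caps the size of files
        // a script can load, so multi-million-row exports may need splitting.
        return { name: csv.name, contents: csv.getContents() };
    }
    return { get: get };
});
```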

EDIT 3: NetSuite Migrations

EDIT 4: netsuite suitescript 2.0 export(csv)

Jonathan Bird

1 Answer


I wrote a tool for pushing CSVs from saved searches to S3. See bundle 271853 or https://github.com/DeepChannel/netsuite-savedsearch-s3.

It can trigger a saved search and then transfer very large result sets.

Once you have the data out, you can use faster ETL tools to load it into your BI system.
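
The bundle's code is on GitHub, but roughly, the core technique for walking a big result set is SuiteScript's paged search API (a generic sketch, not the bundle's actual code):

```javascript
// Generic sketch: walk a saved search 1,000 rows per page and flatten
// each result into a CSV line (quoting/escaping omitted for brevity).
define(['N/search'], function (search) {
    function exportRows(searchId) {
        var lines = [];
        var paged = search.load({ id: searchId }).runPaged({ pageSize: 1000 });
        paged.pageRanges.forEach(function (range) {
            var page = paged.fetch({ index: range.index });
            page.data.forEach(function (result) {
                lines.push(result.columns.map(function (col) {
                    return result.getValue(col);
                }).join(','));
            });
        });
        return lines;
    }
    return { exportRows: exportRows };
});
```

For millions of rows you'd run something like this from a map/reduce script, or hand the whole export to an async SearchTask, since a single scheduled script will run into governance limits.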

bknights
  • That's private :( can't see it mate. That sounds perfect and exactly what I need though. – Jonathan Bird Apr 03 '20 at 06:44
  • Odd. I guess the customer just used the defaults when I transferred it. In the meantime I've added the bundle ID. The code is all included in the bundle as well. – bknights Apr 03 '20 at 15:58
  • @Jonathan The repo is public again. – bknights Apr 03 '20 at 16:52
  • Impressive. Thanks a lot. Saves me having to write it all myself, so I can focus on the ETL. – Jonathan Bird Apr 06 '20 at 03:51
  • We'll be trying this in the near future. Did you ever hit any limits with this script when pulling CSVs with millions of rows? – Jonathan Bird Jun 24 '20 at 07:07
  • I didn't while I was dealing with that, but if you are re-posting old data, you'd potentially save considerably on network costs by loading your documents once and then adding new information incrementally. Do you generate millions of rows a day? – bknights Jun 24 '20 at 22:03
  • Nah, it needs to deal with millions of rows a day. – Jonathan Bird Jun 25 '20 at 07:27