A quick primer on Snowpark

Snowpark is a nice addition to the suite of features now available in Snowflake. With Snowpark we can execute programs inside Snowflake without extracting the data to another environment (think Spark clusters, a local desktop, etc.); instead we run the program in Snowflake and get the results back. So the obvious question is: how does this work internally?

Internally, Snowpark runs on Docker containers inside virtual warehouses, and it hides this complexity from us: we are simply running our code through the Snowpark library, which makes this happen. Just like Spark, it takes advantage of lazy evaluation, where the accumulated set of operations is only executed when an action is taken on the object. And just like Spark, a set of containers performs these operations in the cloud without moving the data to your local machine. We are essentially moving the code to the data, as opposed to moving the data to the code. This is a powerful feature, especially when dealing with a lot of data: we don't want to be moving data in and out of the cloud for every transformation.
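The lazy-evaluation idea is easy to demonstrate with plain Scala collections, which behave analogously to Snowpark DataFrames: a `.view` accumulates transformations without running them, and nothing executes until an action forces the result. This is only an analogy sketch (the `LazyDemo` object and its contents are made up for illustration), not Snowpark code itself:

```scala
object LazyDemo {
  var evaluations = 0 // counts how many times the map step actually runs

  // Build the pipeline lazily: .view defers every map/filter step,
  // much like Snowpark DataFrame operations build up a plan.
  val pipeline = (1 to 10).view
    .map { n => evaluations += 1; n * 2 } // not executed yet
    .filter(_ > 10)

  // The "action" that forces the whole pipeline to execute.
  def run(): List[Int] = pipeline.toList
}
```

Until `LazyDemo.run()` is called, `evaluations` stays at 0; the action forces all ten `map` steps at once. In Snowpark the same pattern applies, except the deferred work is shipped to Snowflake's compute rather than run locally.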

Subsequent posts will cover how to use Snowpark.

Object in Scala

The keyword object in Scala is used to define a singleton; in other words, it is a class with exactly one instance, and everybody uses that same instance. This is handy for a lot of utility classes.

In Java, you essentially have to define a class with a static field, make the constructor private, and then expose only a getInstance method to return the single instance.

All of this is avoided in Scala by using the keyword object.
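For example, a utility singleton needs no private constructor or getInstance boilerplate (the `StringUtils` name and methods here are hypothetical, just to illustrate the pattern):

```scala
// A singleton: exactly one instance, created lazily on first use.
// No `new`, no private constructor, no getInstance needed.
object StringUtils {
  def isBlank(s: String): Boolean =
    s == null || s.trim.isEmpty

  def capitalize(s: String): String =
    if (isBlank(s)) s
    else s.charAt(0).toUpper.toString + s.substring(1)
}
```

Callers simply write `StringUtils.capitalize("snowpark")`; the runtime guarantees a single shared instance.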

If you define a class with the same name as an object in the same file, that object is called a companion object, and the two can access each other's private members.
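A small sketch of this (the `Counter` class is made up for illustration) shows both directions of the private access: the companion calls the class's private constructor, and it reads a private field of an instance:

```scala
// Class and companion object share a name (and would share a source file);
// each can see the other's private members.
class Counter private (private val count: Int) {
  def increment: Counter = new Counter(count + 1)
  def value: Int = count
}

object Counter {
  // A factory method in the companion: allowed to call the
  // private constructor of the class.
  def apply(): Counter = new Counter(0)

  // The companion can also read the instance's private field directly.
  def sum(a: Counter, b: Counter): Int = a.count + b.count
}
```

Because the constructor is private, the only way to get a `Counter` is through the companion's `apply`, e.g. `Counter().increment`.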

https://docs.scala-lang.org/overviews/scala-book/companion-objects.html