Tuesday, December 16, 2014

Ubuntu 14.04 + Apple Time Capsule


Finally got access to the Time Capsule.

1) install libraries:
sudo apt-get install libgcrypt11 libgcrypt11-dev libgmp3-dev readline-common libreadline6 libreadline6-dev libfuse2 libfuse-dev libncurses5-dev

2) download afpfs-ng (link here)
3) unarchive it (tar -xzvf)
4) navigate into the afpfs folder (cd afpfs*)
5) you will have to patch afpfs-ng, because it has a bug that appeared in Ubuntu 12.04/13.04 and is still present in 14.04:
wget -O offset.diff https://bugzilla.redhat.com/attachment.cgi?id=397155 && patch -p1 < offset.diff
wget -O afpfs-ng-0.8.1-pointer.patch https://bugzilla.redhat.com/attachment.cgi?id=505576 && patch -p1 < afpfs-ng-0.8.1-pointer.patch
6) assuming you are in the afpfs folder you unarchived, build and install:
./configure --prefix=/usr
make
sudo make install
7) create a mount point; you will connect to your Time Capsule through it:
mkdir ~/capsule
8) Command for MOUNT: 
mount_afp afp://user:password@server_host_or_ip/Data ~/capsule
9) Command for UNMOUNT:
afp_client unmount ~/capsule


After mounting, you should see the Time Capsule contents under the “capsule” folder in your home directory.
In my case the Time Capsule is at 10.0.1.1.

Thanks to alex!

Wednesday, December 10, 2014

Scala1: Scala is craaazy

I finished the Scala course taught by Martin Odersky and wanted to test it in a real-world scenario. I had a fairly easy task: a CSV file, with headers, that needed to be imported into a database. Back when I used Java, I would prepare myself for a few days of coding. The typical steps are:

  1. csv: find/list Java CSV processing libraries
  2. csv: read about / investigate a few of them
  3. csv: start playing; when one does not work as expected, check another one
  4. db: fight with the proper driver
  5. db: find out how to connect
  6. db: prepare statements/SQL, etc.
  7. finally connect csv+db into one solution and make it work smoothly

And after a few days I would have a working solution with probably a few hundred lines of code... (not to mention tests and documentation).

In Scala (and I am a novice):
  1. created a single object, "importer.scala"
  2. the whole CSV part was finished by adding a single line to my build.sbt file:

libraryDependencies += "com.github.tototoshi" %% "scala-csv" % "1.1.2"

As my CSV file uses ";" as the separator, I put:
  import com.github.tototoshi.csv._
  implicit object MyFormat extends DefaultCSVFormat {
    override val delimiter: Char = ';'
  }

Then I added two lines of code to the main function:
    val reader = CSVReader.open("myfile.csv")(MyFormat)
    val full = reader.allWithHeaders

and... that is it: the whole CSV part done in less than 15 minutes! (And yes, in Java I also use an sbt-like build tool, i.e. Maven.)
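
For reference, here is a small, self-contained sketch of what allWithHeaders gives back; the file name and the columns (id, name, price) are made-up examples, not my real data:

import com.github.tototoshi.csv._

// Minimal sketch: assumes a ";"-separated myfile.csv whose first line is the
// header, e.g. "id;name;price". The file and the columns are hypothetical.
object CsvShapeDemo extends App {
  implicit object MyFormat extends DefaultCSVFormat {
    override val delimiter: Char = ';'
  }

  val reader = CSVReader.open("myfile.csv")
  // allWithHeaders returns List[Map[String, String]]: one map per data row,
  // keyed by the header names, e.g. Map("id" -> "1", "name" -> "hammer", ...)
  val rows: List[Map[String, String]] = reader.allWithHeaders()
  rows.foreach(row => println(row("id") + " -> " + row.getOrElse("name", "")))
  reader.close()
}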

  3. DB part:
Connection: again in build.sbt, I added just one line (I'm using MySQL):
libraryDependencies += "mysql" % "mysql-connector-java" % "5.1.34"
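
(At this point the whole build.sbt is roughly the sketch below; the project name and scalaVersion are placeholders I picked, only the two libraryDependencies lines come from this post:)

name := "importer"

scalaVersion := "2.11.4"

libraryDependencies += "com.github.tototoshi" %% "scala-csv" % "1.1.2"

libraryDependencies += "mysql" % "mysql-connector-java" % "5.1.34"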

... and again, in the main function, I just added the database connection:
    import java.sql.DriverManager

    val dbc = "jdbc:mysql://localhost:3306/mydb?user=me&password=123"
    Class.forName("com.mysql.jdbc.Driver")
    val conn = DriverManager.getConnection(dbc)

and coding... let's go ;-) :
    try {
      full.foreach(fields => processFields(fields, conn))
    } finally {
      conn.close
    }


I know that I could do everything inside "full.foreach", but I have some old Java habits of creating separate methods for different actions... so I just added a processFields function.

  // fieldTypes / myTypes are presumably the CSV header names and the matching
  // DB column names, defined elsewhere in the object.
  // (needs: import scala.collection.mutable.MutableList)
  def processFields(in: Map[String, String], conn: java.sql.Connection) {
    val l: MutableList[String] = new MutableList[String]()
    fieldTypes.foreach(f =>
      l += in(f).replace("\"", "\\\""))
    val sql = "INSERT INTO temp_products (" + myTypes.mkString(",") +
      ") VALUES (\"" + l.mkString("\",\"") + "\")"
    val prep = conn.prepareStatement(sql)
    prep.executeUpdate
  }

As you can see, I had an issue with the ["] character, which sometimes appeared inside CSV fields.
This is plain/native SQL, and the whole solution took me less than 2 hours.
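
As an aside, the manual escaping could be avoided by letting the prepared statement carry the values through "?" placeholders. This is only a sketch of that alternative, not what I ran; csvHeaders / dbColumns are hypothetical stand-ins for my fieldTypes / myTypes lists:

  // Sketch only: same insert, but with "?" placeholders so the JDBC driver
  // handles quoting instead of the manual replace("\"", "\\\"") above.
  def processFieldsWithPlaceholders(in: Map[String, String], conn: java.sql.Connection) {
    val csvHeaders = List("id", "name", "price")   // hypothetical CSV header names
    val dbColumns  = List("id", "name", "price")   // hypothetical DB column names
    val sql = "INSERT INTO temp_products (" + dbColumns.mkString(",") +
      ") VALUES (" + List.fill(csvHeaders.size)("?").mkString(",") + ")"
    val prep = conn.prepareStatement(sql)
    csvHeaders.zipWithIndex.foreach { case (h, i) =>
      prep.setString(i + 1, in(h))   // JDBC parameter indexes are 1-based
    }
    prep.executeUpdate()
    prep.close()
  }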

I must say I was a little disappointed by... how little effort it took to make things work.
All in all I had about 60 lines of code. I really don't see a reason to write comments/documentation, as this is so obvious (if you know Scala) that basically anything more would be overhead.

In the next post I will show you Slick, a proper DB/ORM mapping library for Scala, and how easy it is to do reverse engineering: generate objects from DB tables and run some crazy SQL queries without writing SQL ;-)
