Kamanja Forums / Data Science & Models / Model size limit

This topic contains 7 replies, has 6 voices, and was last updated by  Archived_User1 1 year, 8 months ago.

  • #13284 Reply

    Archived_User7
    Participant

    Team

    We are facing this error while uploading a large model jar to be used via the PMML API. It looks like there is a limit in place that restricts jar uploads to about 50 MB per file.

    Can you confirm whether this is a configuration that can be changed, or whether it will require code changes in the engine? This was working in version 1.0.3 after a fix by Pokuri, but it stopped working after the upgrade to the 1.1 release.

    We have a planned go-live event for tomorrow, so we would appreciate any quick help. Below is the stack trace.

    ERROR [PathChildrenCache-0] - Exception while processing event from zookeeper ZNode /fatafategy/monitor/engine, reason null, message null
    ERROR [PathChildrenCache-1] - Exception while processing event from zookeeper ZNode /fatafategy/monitor/metadata, reason null, message null
    ERROR [PathChildrenCache-0] - Exception while processing event from zookeeper ZNode /fatafategy/monitor/engine, reason null, message null
    ERROR [PathChildrenCache-1] - Exception while processing event from zookeeper ZNode /fatafategy/monitor/metadata, reason null, message null
    ERROR [main] - Failed to insert/update object for : TermsToTags-assembly-1.0.jar, Reason:null, Message:KeyValue size too large
    ERROR [main] -
    StackTrace:com.ligadata.Exceptions.InternalErrorException: Failed to Update the Jar of the object(pmml.searchtweet.000000000000000100): Failed to insert/update object for : TermsToTags-assembly-1.0.jar
            at com.ligadata.MetadataAPI.MetadataAPIImpl$.UploadJarsToDB(MetadataAPIImpl.scala:1313)
            at com.ligadata.MetadataAPI.MetadataAPIImpl$$anonfun$AddFunctions$2.apply(MetadataAPIImpl.scala:2355)
            at com.ligadata.MetadataAPI.MetadataAPIImpl$$anonfun$AddFunctions$2.apply(MetadataAPIImpl.scala:2355)
            at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
            at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
            at com.ligadata.MetadataAPI.MetadataAPIImpl$.AddFunctions(MetadataAPIImpl.scala:2355)
            at com.ligadata.MetadataAPI.TestMetadataAPI$.AddFunction(TestMetadataAPI.scala:243)
            at com.ligadata.MetadataAPI.TestMetadataAPI$$anonfun$23.apply$mcV$sp(TestMetadataAPI.scala:2488)
            at com.ligadata.MetadataAPI.TestMetadataAPI$.StartTest(TestMetadataAPI.scala:2576)
            at scala.com.ligadata.MetadataAPI.StartMetadataAPI$.main(StartMetadataAPI.scala:73)
            at scala.com.ligadata.MetadataAPI.StartMetadataAPI.main(StartMetadataAPI.scala)
  • #13286 Reply

    Archived_User42
    Participant

    The maximum size allowed for an HBase column value is controlled from the Kamanja code.

    Source file: KeyValueHBase.scala
    Source line: config.setInt("hbase.client.keyvalue.maxsize", 419430400);

    I will let the engineering team weigh in on the pros and cons of making the change, testing it, and cutting an RC version.
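    For reference, the two numbers in this thread are exact binary megabyte counts. The sketch below (class and method names are hypothetical, and the check only mimics the client-side size guard, not the actual HBase code) shows the arithmetic and why a 105 MB jar fails under the default limit:

    ```java
    public class KeyValueSizeLimit {
        // Values discussed in this thread, in bytes.
        static final int DEFAULT_MAX = 104857600;   // 100 * 1024 * 1024 = 100 MB
        static final int OVERRIDE_MAX = 419430400;  // 400 * 1024 * 1024 = 400 MB

        // Mimics the client-side guard: a cell larger than the configured
        // max is rejected with "KeyValue size too large".
        static boolean fits(long jarSizeBytes, int maxSizeBytes) {
            return jarSizeBytes <= maxSizeBytes;
        }

        public static void main(String[] args) {
            long jar105mb = 105L * 1024 * 1024;
            System.out.println(fits(jar105mb, DEFAULT_MAX));   // false: matches the failed 105 MB upload
            System.out.println(fits(jar105mb, OVERRIDE_MAX));  // true once the 400 MB override is set
        }
    }
    ```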

  • #13294 Reply

    Archived_User64
    Participant

    In 1.1 (on the master branch), we have the following line (line 158 of KeyValueHbase.scala).

    config.setInt("hbase.client.keyvalue.maxsize", getOptionalField("hbase_client_keyvalue_maxsize", parsed_json, adapterSpecificConfig_json, "104857600").toString.trim.toInt);

    This allows files of up to 100 MB to be uploaded by default. I have tested this with 75 MB, 100 MB, and 105 MB files (the 105 MB file failed to upload).

    If you need to upload a file larger than 100 MB, you can add the following line to your MetadataAPIConfig.properties file:

    ADAPTER_SPECIFIC_CONFIG={"hbase_client_keyvalue_maxsize":"419430400"}

    This sets the HBase maximum size to 400 MB. I tested this locally and was able to upload files as large as 400 MB.
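    The lookup order behind that line can be sketched as follows. This is a simplified, hypothetical stand-in for getOptionalField (the real Kamanja method parses JSON objects, not maps; the method name maxKeyValueSize and the map-based signature are assumptions): the adapter-specific config wins, then the main config, then the hard-coded 100 MB default.

    ```java
    import java.util.HashMap;
    import java.util.Map;

    public class OptionalFieldLookup {
        // Prefer the adapter-specific override, then the main config,
        // then the hard-coded default of 104857600 bytes (100 MB).
        static int maxKeyValueSize(Map<String, String> adapterSpecific,
                                   Map<String, String> mainConfig) {
            String key = "hbase_client_keyvalue_maxsize";
            String value = adapterSpecific.getOrDefault(key,
                    mainConfig.getOrDefault(key, "104857600"));
            return Integer.parseInt(value.trim());
        }

        public static void main(String[] args) {
            Map<String, String> empty = new HashMap<>();
            // No override anywhere: the 100 MB default applies.
            System.out.println(maxKeyValueSize(empty, empty)); // 104857600

            // ADAPTER_SPECIFIC_CONFIG override raises the cap to 400 MB.
            Map<String, String> override = new HashMap<>();
            override.put("hbase_client_keyvalue_maxsize", "419430400");
            System.out.println(maxKeyValueSize(override, empty)); // 419430400
        }
    }
    ```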

    As to why it's failing to upload a 50 MB file, I'm not sure. Version 1.1 contains the code necessary to perform this operation. Are you certain you're on 1.1? I looked at MetadataAPIImpl.scala:1313 from the stack trace, and the exact def UploadJarsToDB isn't there; instead there is UploadJarToDB (note the missing plural). The catch statement for UploadJarsToDB (plural again) is on line 1292.

    • #13295 Reply

      Archived_User1
      Participant

      Just a reminder that this forum is the right place to post and get issues resolved. It is our first line of support, available 24/7 and monitored by engineering teams in different time zones.
