0 Replies Latest reply on Jul 25, 2013 7:05 AM by Naveen Dhayalan

    Issue in updating properties in an Asset metadata node

    Naveen Dhayalan

      Hi All,


      I have the following requirement:

               To import a PDF file together with an XML file (containing metadata information) into CQ 5.4 via WebDAV from an external system. I followed the steps below to achieve it.


      1. A launcher listens on a specified path with the condition jcr:content/mimetype==application/xml so that only XML files are processed. The XML file contains the name of the PDF file that needs to be imported, plus the metadata that must be added to the asset's metadata node.
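      For reference, a launcher of this kind is configured as a node under /etc/workflow/launcher/config. The sketch below is illustrative only: the node name, glob, and workflow model path are placeholders, and note that for an nt:file uploaded via WebDAV the MIME type usually lives at jcr:content/jcr:mimeType.

```xml
<!-- Hypothetical launcher sketch; node name, glob and workflow path are placeholders -->
<xmlImportLauncher
    jcr:primaryType="cq:WorkflowLauncher"
    enabled="{Boolean}true"
    eventType="{Long}1"
    glob="/content/dam/import(/.*)?"
    nodetype="nt:file"
    condition="jcr:content/jcr:mimeType==application/xml"
    workflow="/etc/workflow/models/xml-import/jcr:content/model"/>
```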

      2. A workflow reads the XML file and fetches the PDF file by name. Using the AssetManager service, the asset is created under the "/content/dam" path; here is the code:


              AssetManager manager = resolver.adaptTo(AssetManager.class);
              Property data = NodeHelper.getJcrNode(srcNode).getProperty("jcr:data");

              // assetName - retrieved from the xml file
              String targetAssetPath = targetNode.getPath() + "/" + assetName;
              Resource existingAsset = resolver.getResource(targetAssetPath);
              if (existingAsset != null) {
                  // if the asset already exists, remove it first
                  Node existingNode = existingAsset.adaptTo(Node.class);
                  Node parent = existingNode.getParent();
                  existingNode.remove();
                  parent.getSession().save();
              }

              String mimeType = new MimetypesFileTypeMap().getContentType(srcNode.getName());
              Asset asset = manager.createAsset(targetAssetPath, data.getBinary().getStream(), mimeType, true);


      3. Once the asset is created, I have to update the metadata node and prevent the ExtractMetadataProcess step of the DAM Update Asset workflow from overriding the metadata taken from the XML file.


      Updating the metadata node: I have to wait for the asset synchronization between "/var/dam" and "/content/dam" to complete before I can get the metadata node to update. To achieve this, I wait for a few minutes, then throw a timeout exception, delete the asset, and notify the author of the failure. Is there a different solution, or any suggestion that may help?
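      The waiting step above can be sketched as a generic poll-with-timeout helper. This is a minimal sketch, not the actual implementation: the class and parameter names are illustrative, and in the real process step the condition would be something like a session.nodeExists(...) check on the asset's metadata path.

```java
import java.util.function.BooleanSupplier;

public class PollUtil {

    // Polls the condition until it becomes true or the timeout expires.
    // Returns true if the condition was met before the deadline, false on timeout.
    public static boolean pollUntil(BooleanSupplier condition,
                                    long timeoutMs, long intervalMs)
            throws InterruptedException {
        long deadline = System.currentTimeMillis() + timeoutMs;
        while (System.currentTimeMillis() < deadline) {
            if (condition.getAsBoolean()) {
                return true;
            }
            Thread.sleep(intervalMs);
        }
        return false;
    }

    public static void main(String[] args) throws InterruptedException {
        long start = System.currentTimeMillis();
        // Toy condition that becomes true after ~150 ms; in the workflow step it
        // would be e.g. () -> session.nodeExists(assetPath + "/jcr:content/metadata")
        boolean ok = pollUntil(() -> System.currentTimeMillis() - start > 150, 2000, 50);
        System.out.println(ok); // prints "true"
    }
}
```

      On timeout (a false return) the caller can delete the asset and notify the author, as described above.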


      Skipping ExtractMetadataProcess: During the workflow import process I set a flag on the asset's jcr:content node, but only for files imported via WebDAV. I created a new metadata process that replaces ExtractMetadataProcess in the DAM Update Asset workflow; it reads the flag from the jcr:content node and skips extraction only for these files. Here is the code:


              LOG.debug("Get value of property [{}] of jcrNode: [{}]", "import", jcrNode);
              // "import" is a Java keyword, so the flag is read into isImport
              boolean isImport = NodeHelper.getBooleanPropertyFromNode(jcrNode, "import");
              LOG.debug("isImport: [{}]", isImport);
              if (!isImport) {
                  // not imported via WebDAV - run the normal metadata extraction step
                  Asset asset = getAssetFromPayload(item, session);
                  if (null != asset) {
                      AssetHandler handler = getAssetHandler(asset.getMimeType());
                      ExtractedMetadata metadata = handler.extractMetadata(asset);
                      metadata.setMetaDataProperty("dam:extracted", Calendar.getInstance().getTime());
                      saveMetadata(asset, metadata);
                  } else {
                      LOG.error(LogMessageHandler.getInstance().getMessageOutOfKey("P07200000001"),
                              item.getWorkflowData().getPayload().toString());
                  }
              }




      Because of the new process, thumbnail creation stopped working, but there is no error in the log. Please check the model config:


      Model config (relevant step attributes):


                  description="Extracts XMP, DC, etc. for all formats. If it is an import then avoid the metadata extraction step."
                  title="Metadata Extraction"

                  description="Image &amp; document thumbnails"
                  title="Thumbnail creation"

      4. A critical problem faced with the above implementation:

             It works fine for a small number of files, but under heavy load (around 1,000 files) many workflow instances move to the STALE state, which blocks further processing.

      Please suggest any workaround or a completely new solution.