Error data="E_PACK_NO_TARGET_LOCATION"

Hello, I get the data="E_PACK_NO_TARGET_LOCATION" error when I try to package a PDF book.
[root@www stores]# java -Xmx1024M -jar /var/lib/ebooks/uploadtest-1_2.jar http://www.xxx.xx:8080/packaging/Package 1265369812indice.pdf -pass xxxxxx
Creating package request for: 1265369812indice.pdf
Creating connection to Packaging Server: http://www.xxx.xx:8080/packaging/Package
Sending Package Request
There was an error with the Package Request
<error xmlns="http://ns.adobe.com/adept" data="E_PACK_NO_TARGET_LOCATION http://www.xxx.xx:8080/packaging/Package"/>
Finished!
Successful packages created: 0
Unsuccessful package attempts: 1
Here are the files that failed to package:
1265369812indice.pdf
Could you help me?
Thanks!

I have this packaging.conf:
com.adobe.adept.persist.sql.driverClass=com.mmysql.jdbc.Driver
com.adobe.adept.persist.sql.connection=jdbc:mysql://127.0.0.1:3306/adept
com.adobe.adept.persist.sql.user=content
com.adobe.adept.persist.sql.password=xxxx
com.adobe.adept.persist.sql.dialect=mysql
com.adobe.adept.log.level=trace
com.adobe.adept.log.file=/var/log/fulfillment.log
com.adobe.adept.packaging.baseLocation=/var/lib/ebooks
com.adobe.adept.packaging.baseURL=http://www.xxxx.xx:8080/ebooks
com.adobe.adept.serviceURL=http://www.xxxx.xx:8080/packaging
And here is the error again:
[root@www stores]# java -Xmx1024M -jar /var/lib/ebooks/uploadtest-1_2.jar http://www.xxxx.xx:8080/packaging/Package 1265369812indice.pdf -pass xxxx
Creating package request for: 1265369812indice.pdf
Creating connection to Packaging Server: http://www.xxxx.xx:8080/packaging/Package
Sending Package Request
There was an error with the Package Request
<error xmlns="http://ns.adobe.com/adept" data="E_PACK_NO_TARGET_LOCATION http://www.xxxx.xx:8080/packaging/Package"/>
Finished!
Successful packages created: 0
Unsuccessful package attempts: 1
Here are the files that failed to package:
1265369812indice.pdf
Thank you!
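
A quick way to sanity-check the packaging.conf above is a small standalone check like the sketch below. This is a minimal sketch, not an official Adobe tool, and it rests on two assumptions: that E_PACK_NO_TARGET_LOCATION means the packaging servlet cannot use its configured target location, and that the driver class is meant to be com.mysql.jdbc.Driver rather than the com.mmysql.jdbc.Driver shown in the config (MySQL Connector/J must be on the classpath for the driver check to pass).

// PackagingConfigCheck.java -- hedged sanity check for the packaging.conf shown above.
import java.io.File;

public class PackagingConfigCheck {
    public static void main(String[] args) {
        // 1. baseLocation must exist and be writable by the user running the
        //    servlet container (e.g. Tomcat), not just by root on the shell.
        File base = new File("/var/lib/ebooks");
        System.out.println("baseLocation exists:   " + base.isDirectory());
        System.out.println("baseLocation writable: " + base.canWrite());

        // 2. The JDBC driver class named in packaging.conf must be loadable.
        //    "com.mmysql.jdbc.Driver" (as written in the config) will not load;
        //    "com.mysql.jdbc.Driver" should, if Connector/J is on the classpath.
        try {
            Class.forName("com.mysql.jdbc.Driver");
            System.out.println("JDBC driver loaded OK");
        } catch (ClassNotFoundException e) {
            System.out.println("JDBC driver not found: " + e.getMessage());
        }
    }
}

Run it as the same user and on the same host as the packaging web application; if either check fails, fix that before retrying the upload test.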

Similar Messages

  • Error " Data missing for the entry check while creating a new waste code

    Hi all, while setting up a new waste code I get the error "Data missing for the entry check, correction:" when filling in the NAM - WASTECOCAT - LER item.
    This should look up the catalog name included in the phrase set, but for some reason it doesn't find it and gives me this error.
    I am changing the original characteristics, phrase set, classes, and value assignment type, just to have my own structure with Z-names for all of them.
    I have also changed the environment parameter WAM_PHRSET_WACATLG to the name of my phrase set.
    I have checked everything several times, watching for typos or a missing step.
    I have even tried including my new Z characteristics in the class while leaving the original SAP_EHS_1024_001_WASTE_CATALOG (changing the environment parameter WAM_PHRSET_WACATLG back to SAP_EHS_1024_001_WASTE_CATALOG), and it works.
    I would like to replace this characteristic with Z_EHS_WA_WASTE_CATALOG,
    the phrase set with Z_EHS_WA_WASTE_CATALOG,
    and the environment parameter WAM_PHRSET_WACATLG with Z_EHS_WA_WASTE_CATALOG.
    After matching up the master data it should work fine, but I might be missing something.
    Any idea?
    Regards,
    Alvaro

    Hello Juan Carlos, the value and class that I want to duplicate and that don't work are for the waste code; I have also duplicated the one you displayed (waste properties) without any problem.
    1. I have duplicated class SAP_EHS_1024_001 and renamed it to Z_EHS_WA.
    2. Created a copy of the 5 characteristics included in this class:
    SAP_EHS_1024_001_WASTE_CATALOG
    SAP_EHS_1024_001_WASTE_CODE
    SAP_EHS_1024_001_WA_SUBCATEG
    SAP_EHS_1024_001_WA_CATEGORY
    SAP_EHS_1024_001_REMARK
    and renamed them to
    Z_EHS_WA_WASTE_CATALOG
    Z_EHS_WA_WASTE_CODE
    Z_EHS_WA_SUBCATEG
    Z_EHS_WA_CATEGORY
    Z_EHS_WA_REMARK.
    I checked that the function C14K_WASTECATLG_CHECK is in the value of the Z_EHS_WA_WASTE_CODE characteristic,
    and that the function C14K_WASTECODE_CHECK is in the value of the Z_EHS_WA_WASTE_CATALOG characteristic.
    3. Created phrase sets for each new category, with the same names.
    4. Matched up the master data.
    5. Changed the environment parameter to Z_EHS_WA_WASTE_CATALOG.
    I think I have followed all the steps, but for some reason it doesn't find the catalog.
    The phrase for the catalog is EWC in English and LER in Spanish.
    Regards
    Alvaro.

  • Job cancelled with the error "Data does not match the job definition; job terminated"

    Dear Friends,
    The following job is with respect to an inbound interface that transfers data into SAP.
    The file mist.txt is picked up from the /FI/in directory of the application server and moved to the /FI/work directory for processing. Once the program ends without any error, the file is moved to the /FI/archive directory.
    Below are the steps listed in the job log. No spool is generated for this job, and it ended with the error "Data does not match the job definition; job terminated". Please see below for more info.
    1. Job started
    2. Step 001 started (program Y_SAP_FI_POST, variant MIST, user ID K364646)
    3. File mist.txt copied from /data/sap/ARD/interface/FI/in/ to /data/sap/ARD/interface/FI/work/.
    4. File mist.txt deleted from /data/sap/ARD/interface/FI/in/.
    5. File mist.txt read from /data/sap/ARD/interface/FI/work/.
    6. PD-DKLY-Y_SAP_FI_POST: This job was started periodically or directly from SM36/SM37 (Message Class: BD, Message Number: 076)
    7. Job PD-DKLY-Y_SAP_FI_POST: Data does not match the job definition; job terminated (Message Class: BD, Message No. 078)
    8. Job cancelled after system exception ERROR_MESSAGE
    Could you please analyse under what circumstances the above error is reported?
    I have also heard that the above error can be raised because of customization issues in transaction BMV0.
    Also note that jobs can be defined and scheduled from that transaction, and the corresponding data is stored in table TBICU.
    My Trials
    1. Tested uploading an empty file
    2. Tested uploading a file with wrong data
    3. Tested uploading a file with improper data that has a wrong file structure
    But I failed to simulate the above scenario.
    Clarification Required
    Assume that I have defined a job using BMV0. Is it mandatory to use the same job in SM36/SM37 for scheduling?
    Is the above question valid?
    Edited by: dharmendra gali on Jan 28, 2008 6:06 AM


  • Error (Data mining): The specified mining structure does not contain a valid model for the current task.

    I'm trying to run the cross-validation report on a mining structure that contains just a Microsoft Association Rules mining model. In Target Attribute, I've tried:
    Actual(Service Description).SE value
    Actual([Service Description]).[SE value]
    Actual(Service Description)
    Actual([Service Description])
    because I don't know the exact correct format, but none of them worked, and I always get the following error:
    Error (Data mining): The specified mining structure does not contain a valid model for the current task.
    The following is my mining model structure.

    Association rules does not allow for cross-validation
    Mark Tabladillo PhD (MVP, SAS Expert; MCT, MCITP, MCAD .NET) http://www.marktab.net

  • Error: Data cannot be inserted as there is no matching record

    Hi,
    I have migrated SharePoint 2010 site to SharePoint 2013.
    I have a custom list in which I was able to edit multiple records after opening it in Access in SharePoint 2010.
    After the migration, when I open the list with "Open with Access" and try to modify any data, I get the error "Data cannot be inserted as there is no matching record". The custom list has columns taken from a lookup list.
    I have tried re-linking the list, which has not helped.
    I found that for some of the lookup columns where the data is empty, the value is shown as 0.
    This is causing the issue.
    How to fix this?
    Thanks

    Hi Venkatzeus,
    According to your description, my understanding is that the error occurs when you open the list with Access after migrating from SharePoint 2010 to SharePoint 2013.
    Did this issue occur with other lists?
    I recommend creating a new blank database in Access and then importing the list into it to see if the issue still occurs.
    As a test, I also recommend trying the same scenario with a new list to narrow down the scope of the issue.
    Best regards.
    Thanks
    Victoria Xia
    TechNet Community Support

  • How to handle error data in BAPI PO creation

    Hi friends ,
    When uploading data with the PO creation BAPI, how do I handle/capture the error data?
    arun

    Hi,
    After the BAPI call completes, the IT_RETURN table will contain all the messages.
    Loop over the IT_RETURN internal table and display the data.
    Regards
    Sudheer
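
    To make the answer concrete, here is a minimal sketch of the same idea driven from Java through SAP JCo 3. The destination name "PO_SYSTEM" and the use of BAPI_PO_CREATE1 are assumptions for illustration; in ABAP the equivalent is simply a LOOP AT it_return after calling the BAPI.

    // PoReturnCheck.java -- hedged sketch, assumes a configured JCo destination "PO_SYSTEM".
    import com.sap.conn.jco.JCoDestination;
    import com.sap.conn.jco.JCoDestinationManager;
    import com.sap.conn.jco.JCoException;
    import com.sap.conn.jco.JCoFunction;
    import com.sap.conn.jco.JCoTable;

    public class PoReturnCheck {
        public static void main(String[] args) throws JCoException {
            JCoDestination dest = JCoDestinationManager.getDestination("PO_SYSTEM");
            JCoFunction bapi = dest.getRepository().getFunction("BAPI_PO_CREATE1");

            // ... fill the PO header/item parameters here before executing ...
            bapi.execute(dest);

            // Every message (success, warning, error) lands in the RETURN table.
            JCoTable ret = bapi.getTableParameterList().getTable("RETURN");
            for (int i = 0; i < ret.getNumRows(); i++) {
                ret.setRow(i);
                String type = ret.getString("TYPE");       // E = error, W = warning, S = success
                String message = ret.getString("MESSAGE");
                System.out.println(type + ": " + message);
            }
            // On errors the PO is not saved; the usual follow-up is
            // BAPI_TRANSACTION_ROLLBACK, and BAPI_TRANSACTION_COMMIT on success.
        }
    }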

  • SCOM 2012 R2 Reporting services installation error - data reader account

    I have successfully installed SCOM 2012 R2 and SQL 2012 SP1 on the same server. I was trying to install the reporting server piece of SCOM.
    When I get to the part that asks for a domain account, I get this error:
    "Data Reader account provided is not same as that in the management server"
    The account is not the same as the management server, and it does have read access to the data warehouse and reporting databases. Is there some other permission this account needs for reporting services?
    Thanks.

    Adding more info:
    I usually create the following accounts for SQL:
    AD accounts for the following to be used with SQL Services:
    SQL Agent for SQL Server Agent
    SQL Analysis Services for SQL Server Analysis
    SQL Database Engine for SQL Server Database Engine
    SQL Reporting Services for SQL Server Reporting Services
    For SCOM I create the following accounts:
    SCOM DR - Used to deploy reports
    SCOM DWR - for writing data from the management server to the reporting DW
    SCOM MSA - Performs default actions
    SCOM SDK - Used for updating/reading information from the operations Manager Database - Needs to be local administrator on the SCOM server

  • On closing Acrobat, Windows reports the error "Data Execution Prevention has closed Adobe Acrobat"

    Hi,
    I developed a plugin which captures Before/After open, close and save events and performs some operations inside these events.
    I tested the plugin on Windows XP with Acrobat 9.0.0 and 10.0.0 and everything works fine. But when I test my plugin on Windows Server 2008 Standard 32-bit, Windows detects an error while closing Acrobat and shows the following message:
    "Data Execution Prevention has closed Adobe Acrobat"
    I am using Acrobat 10.0.0 for testing on Windows Server 2008. Debugging the program shows no error, but the plugin does not produce the desired result when tested on Windows Server 2008 Standard 32-bit.
    Can anyone please guide me on how to resolve this issue?
    Thanks in advance!

    Are you really using 10.0.0?  You've installed no updates to it??

  • Error "Data file is empty" in standard Copy package

    Hi,
    We have an appset that consists of four applications, and we can't successfully run the Data Manager Copy package in one of them. It launches the following tasks: Dump, Convert (execution fails in this step with the error "Data file is empty") and Load. The SSIS configuration in BIDS is the default and we haven't set any parameters.
    We have figured out that this error appears when we select any member of a dimension in the "COPYMOVEINPUT" prompt, except for the Time dimension. Previously there was a custom Copy package based on the standard BPC one, and it only filtered by the Time dimension. Perhaps this error is related to the application configuration for running the custom package.
    Here is the package script:
    INFO(%TEMPFILE1%,%TEMPPATH%%RANDOMFILE%)
    INFO(%TEMPFILE2%,%TEMPPATH%%RANDOMFILE%)
    TASK(DUMP,APPSET,%APPSET%)
    TASK(DUMP,APP,%APP%)
    TASK(DUMP,USER,%USER%)
    TASK(DUMP,FILE,%TEMPFILE1%)
    TASK(DUMP,SQL,%SQLDUMP%)
    TASK(DUMP,DATATRANSFERMODE,2)
    TASK(CONVERT,INPUTFILE,%TEMPFILE1%)
    TASK(CONVERT,OUTPUTFILE,%TEMPFILE2%)
    TASK(CONVERT,CONVERSIONFILE,%CONVERSION_INSTRUCTIONS%)
    TASK(LOAD,APPSET,%APPSET%)
    TASK(LOAD,APP,%APP%)
    TASK(LOAD,USER,%USER%)
    TASK(LOAD,FILE,%TEMPFILE2%)
    TASK(LOAD,DATATRANSFERMODE,4)
    TASK(LOAD,DMMCOPY,1)
    TASK(LOAD,CLEARDATA,%CLEARDATA%)
    TASK(LOAD,RUNTHELOGIC,%RUNLOGIC%)
    TASK(LOAD,CHECKLCK,%CHECKLCK%)
    Variables such as %CONVERSION_INSTRUCTIONS% aren't defined anywhere. Is it a system constant?
    Thanks.

    Hi Roberto,
    Thanks for having a look at my question.
    We're using .act files to upload data from SAP BW into SAP BPC.
    This is the content of the .act file that I'm trying to upload:
    ACTUAL
    1
    1
    GCN.CZN,2621.LC_.EUR,100.5000
    GCN.CZN,2621.TC_.CZK,7050.0000
    Transformation file and conversion files (Time, Category, Entity, Counterpart, RCCinterco, IntercoCurr, Transcurrency) not shown here.
    In case of any other info needed, please let me know.
    Thanks a lot in advance,
    Wai Yee Kong

  • Internal error: data not found message

    Every time I try to upload pics I get this message:
    internal error: data not found
    How can I fix this problem? Thank you so much, Stacey

    Please explain in detail the process you are trying to perform. Do tell us if you have changed any settings or preferences, how many files you are trying to upload, and confirm that your account is not expired and that you are an admin.
    -Garry

  • Error - Data not found

    I get the error "data not found" when I click the custom 'Next' button on a page of my application, even though there are records in the database table. How can this be resolved?
    Thank you.

    User,
    What is your name?
    Can you put an example on apex.oracle.com and provide the workspace/username/password?
    Regards,
    Dan
    http://danielmcghan.us
    http://sourceforge.net/projects/tapigen
    http://sourceforge.net/projects/plrecur
    You can reward this reply by marking it as either Helpful or Correct ;-)

  • ERROR OGG-01148 programming error, data type not supported for column

    I am getting the following error when I put NULL in an insert statement:
    2011-03-31 18:30:45 ERROR OGG-01148 programming error, data type not supported for column TXID in table advoss.tblaudittrail.
    I am replicating MySQL 5.5.9 to Oracle 11g Release 2 via GoldenGate 11.

    I was able to diagnose what is causing the problem: the unsigned flag on the column was the culprit.
    I am able to insert NULL after removing the unsigned flag.
    Thank you very much for your kind support.
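
    For reference, the fix described above amounts to redefining the column without the UNSIGNED attribute on the MySQL side. Below is a minimal sketch via JDBC; the signed type BIGINT and the connection details are assumptions, so check the real definition with SHOW CREATE TABLE advoss.tblaudittrail first, and coordinate any DDL on a replicated table with GoldenGate.

    // DropUnsignedFlag.java -- hedged sketch, not a GoldenGate utility.
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class DropUnsignedFlag {
        public static void main(String[] args) throws SQLException {
            // Requires MySQL Connector/J on the classpath; credentials are placeholders.
            try (Connection con = DriverManager.getConnection(
                     "jdbc:mysql://127.0.0.1:3306/advoss", "user", "password");
                 Statement st = con.createStatement()) {
                // MODIFY restates the column definition without UNSIGNED and keeps
                // the column nullable so NULL inserts replicate cleanly.
                st.executeUpdate(
                    "ALTER TABLE advoss.tblaudittrail MODIFY TXID BIGINT NULL");
            }
        }
    }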

  • Passing values to a subreport in SSRS throws the error "Data Retrieval failed for the report, please check the log for more details"

    Hi,
    I have a subreport called from the main report. The subreport is based on an MDX query against an SSAS cube. Some dimensions in the cube have the values 0 and 1.
    When I try to pass '0' to the subreport as the parameter value, it gives the error "Data Retrieval failed for the report, please check the log for more details".
    I am using a table to store these parameter values. In the main report I query this table (as a dataset) and pass the values to the subreport.
    When I pass [0],[1] it works fine, but when I pass only [0] or only [1] it throws the error.
    Could you please advise on this.
    Appreciate all and any help.
    Thanks,
    Divya

    Hi Divya,
    Based on the current description, my understanding is that there is no issue when you pass two values from the main report to the subreport, while the issue occurs when passing one value.
    To narrow down the issue, please confirm whether the subreport can run on its own with only [0] or [1]. If so, it indicates that the query statements in the subreport contain an error; if not, the issue occurs while passing values from the main report to the subreport. For further analysis, please post the query statements of the subreport to the forum.
    Regards,
    Heidi Duan
    TechNet Community Support

  • Unable to connect to the server.. Error Data : API_NOT_INITIALIZED on CUEAC Server Installation

    Hi all,
    I am trying to install the CUEAC Attendant Server on Windows Server 2008 R2 and have tried several times. First, it got stuck on the Database Wizard. After three reinstalls, it still got stuck but somehow built the configuration database named CFGATT and then started installing the TSP. Finally, the setup finished and I went to the WebAdmin page. After I entered the login information, it gave the error:
    "Unable to connect to the server.. Error Data : API_NOT_INITIALIZED"
    I searched through the forum and found this topic :
    https://supportforums.cisco.com/message/1215763
    There I found that there is a bug, and I tried to apply the fix myself as instructed:
    http://tools.cisco.com/Support/BugToolKit/search/getBugDetails.do?method=fetchBugDetails&bugId=CSCte44454
    However, it changed nothing. Looking forward to your responses.
    Regards,
    Fikri

    Fikri,
    It is not working because this is not a supported environment; the current release is not supported on Windows Server 2008 R2.
    The next release (April 2013) will support that version of Windows Server.

  • Getting error "Data size bigger than max size for this type"

    Hi,
    I'm getting the error "Data size bigger than max size for this type: 122446" when I upload an RTF template. The RTF template size is 831 KB. Can anyone let me know the maximum size an RTF template can be?
    Thanks

    Hi,
    Can you please let me know what could be the reasons for this error? I am stuck on this issue.
    Thanks.
