RAR DATA Export issue

Hello,
When exporting data from a RAR 5.3 system using the GRC migration link, I am getting a permission error, even though I have the administrator role assigned to me and the local folder has full access:
/export_acd\CONNECTORSdata.dat (Permission denied)
Thanks,
Prasant

Prasanth,
While importing the .DAT files, I get the error below:
"Failed to retrieve data due to failure reading dir"
Did you face this during migration? Any help would be appreciated.
Thanks,
Raghav
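A generic first check for the "Permission denied" error above is whether the OS user running the application server can actually write to the export directory. A minimal sketch (the path is the one from the error message; everything else here is an assumption):

```shell
# Report whether a directory exists and is writable by the current user.
check_writable() {
    if [ -d "$1" ] && [ -w "$1" ]; then
        echo "writable"
    else
        echo "not writable"
    fi
}

# Run as the OS user that owns the application server process:
check_writable /export_acd
```

If it reports "not writable", granting access to that user (chown/chmod on Unix, or the folder's Security tab on Windows) is the usual fix.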

Similar Messages

  • Regarding web analysis data export issue

    Hi all,
I am new to Web Analysis. We had an issue with one of the reports.
I can open the report and work on it perfectly, but when I export the data to Excel it shows the following error:
FATAL ERROR: gui report is inccorect
In the same application I have some other reports that work fine, and data is exported to Excel for those.
What might be going on with that report? Please let me know if you have any ideas.
    Thanks,
    Ramesh

As the report was exceeding 65,000 records, we modified it by adding pages and reducing the count of output records. That resolved the issue.

  • Data Exportation

    I am stumped on a Data Exportation issue. I thought I would post the issue on the forum to see if anyone can help me. We are using Planning 11.1.1.3.
    In essbase we had a calc script written to export data to a txt file. When I ran the calc script recently I noticed a department (dept 101) was not being included in the export file. Here is what has me stumped:
In the Department dimension, dept 101 sits between dept 100 and dept 102. The export file pulls in dept 100 and dept 102. I have to think the CS is written correctly; there is nothing in it that singles out dept 101, and it is written very generally.
When I look at the departments in Planning I see nothing different in the setup for dept 101 compared to the setup of departments 100 and 102. I realize it's probably hard to diagnose the issue without viewing the dimensions, export file, CS, etc. However, any comments/suggestions would be greatly appreciated. Thx!

When I look at the departments in Planning I see nothing different in the setup for dept 101 compared to the setup of departments 100 and 102.
---- Did you check the storage properties and consider the possibility of implied sharing? (Just an area to watch out for; it sounds similar to such a case. I don't have the outline so I can't really tell; just a general point.)
-VD
    -VD

  • Data Export to Oracle table from essbase issue

    Hello,
I am using a data export calc script to load data from Essbase into an Oracle table. We have Essbase 11.1.2.1 on a Windows 2008 R2 64-bit server. I have an ODBC system DSN created for this job.
However, when I launch this process I get a message in the log: "Cannot read SQL driver name for [Backup Exec Catalogs] from [ODBC.INI]"
I have checked the ODBC.ini file in the C:\Windows directory, and that file contains the connection entries...
Any thoughts as to why I am getting this error on one server whereas the same process on the SIT server works fine?
    thanks...

Please restart the application and try again. Check the application log for any details.

  • Essbase Data Export not Overwriting existing data file

We have an ODI interface in our environment which is used to export data from Essbase apps to text files using data export calc scripts; we then load those text files into a relational database. Lately we are seeing an issue where the data export calc script is not overwriting the file and is just appending the new data to the existing file.
    The OverWriteFile option is set to ON.
SET DATAEXPORTOPTIONS {
     DataExportLevel "Level0";
     DataExportOverWriteFile ON;
     DataExportDimHeader ON;
     DataExportColHeader "Period";
     DataExportDynamicCalc ON;
};
The "Scenario" variable is a substitution variable which is set at runtime. We are trying to extract "Budget", but the calc script is not clearing the "Actual" scenario (the scenario extracted earlier) from the text file. It's like after the execution of the calc script, the file contains both "Actual" and "Budget" data. We are not able to find the root cause of why this is happening and why the OVERWRITEFILE option is not being honored by the data export calc script.
We also deleted the text data file to make sure there were no temporary files on the server. But when we ran the data export directly from Essbase again, the file once more contained both "Actual" and "Budget" data, which is really strange. We have never encountered an issue like this before.
    Any suggestions regarding this issue?

Did some more testing and pretty much zeroed in on the issue. Our Scenario is actually something like "Q1FCST-Budget", "Q2FCST-Budget", etc.
This is the reason we need to use a member function: the calc script reads "&ODI_SCENARIO" (which is set to Q2FCST-Budget) as a number and gives an error. To convert this value to a string we are using the @MEMBER function, and this seems to be the root cause of the issue. The ODI_SCENARIO variable is set to "Q2FCST-Budget", but when we run the script with the calculation function @MEMBER("&ODI_SCENARIO"), the data file brings back the values for "Q1FCST-Budget" out of nowhere, in addition to the "Q2FCST-Budget" data we are trying to extract.
Successful Test Case 1:
1) Hard-coded the scenario in the script
   e.g. "Q2FCST-Budget"
2) Ran the script
3) Result OK. The script overwrote the file with Q2FCST-Budget data.
Successful Test Case 2:
1) Put the scenario in the @MEMBER function
   e.g. @MEMBER("Q2FCST-Budget")
2) Results again OK
Failed Case:
1) Deleted the file
2) Put the scenario in a substitution variable and used the member function @MEMBER("&ODI_SCENARIO"), then ran the script. *ODI_SCENARIO is set to Q2FCST-Budget in Essbase variables.
   e.g. @MEMBER("&ODI_SCENARIO")
3) Result: the text file contained both "Q1FCST-Budget" and "Q2FCST-Budget" data values.
We are still not close to the root cause of why this issue is happening. Putting the sub var in the @MEMBER function changes the complete picture and gives us inaccurate results.
    Any clues anyone?
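For reference, the export under discussion presumably has roughly this shape — a sketch only; the file path and option list are assumptions, not the actual script:

```
SET DATAEXPORTOPTIONS {
    DataExportLevel "Level0";
    DataExportOverWriteFile ON;
};
FIX (@MEMBER("&ODI_SCENARIO"))
    DATAEXPORT "File" "," "/tmp/scenario_export.txt";
ENDFIX
```

One thing worth verifying in this shape is whether &ODI_SCENARIO expands to exactly one member name, since @MEMBER on an unexpected string is the step that correlates with the extra scenario appearing.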

  • Automate data export from website

    Hello,
I am stuck on an issue regarding data export from scopus.com. I have 61 keywords for which I want to export (as CSV) the number of documents by year, country, document type, and subject area. Doing that by hand would be extremely tedious; hence my question whether this can be automated. Unfortunately Scopus does not provide an API to access the data directly.
So, the process to be automated would be:
    1. Open http://www.scopus.com/home.url
    2. Enter the keyword in "search for"
    3. Click on "search"
    4. Click on "Analyze results"
5. Click on tab "year" and "export". Save the file
6. Click on tab "country" and "export". Save the file
7. Click on tab "document type" and "export". Save the file
8. Click on "subject area" and "export". Save the file
Do programs exist that can help me retrieve the data?
    Thanks!

You could achieve your goal with bash and wget, using some combination of --save-cookies and --post-data.
    But be aware that scopus is run by Elsevier, whom many view as the most evil of the publishing companies. Think twice before doing something that may annoy them.
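As a rough illustration of that wget approach — a sketch only, since the real form fields and export endpoint must be read from the site's actual requests in browser dev tools (all names below are assumptions):

```shell
#!/bin/sh
# Keep a cookie jar across requests and fetch one CSV per analysis tab.
COOKIES=cookies.txt
BASE="http://www.scopus.com"

# Build the POST body for one keyword/tab pair (field names are guesses).
build_post() {
    printf 'searchterm=%s&tab=%s&format=csv' "$1" "$2"
}

# Fetch one tab's CSV export for a keyword, reusing the session cookies.
fetch_tab() {
    wget --save-cookies "$COOKIES" --load-cookies "$COOKIES" \
         --post-data "$(build_post "$1" "$2")" \
         -O "${1}_${2}.csv" \
         "$BASE/results/export.url"    # assumed export endpoint
}

# Loop over the 61 keywords, one per line in keywords.txt.
if [ -f keywords.txt ]; then
    while read -r kw; do
        for tab in year country doctype subjarea; do
            fetch_tab "$kw" "$tab"
        done
    done < keywords.txt
fi
```

Each keyword then yields four files such as nanotech_year.csv, nanotech_country.csv, and so on.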

  • Powershell Array Export Issues

Hey everyone! I might just be going about this the wrong way, but here's my issue. The script functions as I need it to; however, when I export the values to my .csv, the columns come out in a different order than I think they should. Here's what I've got:
    $list = Get-content targets.txt
    Foreach($_ in $list) {
    $SAM = Get-RegValue -ComputerName $_ -Key Software\Microsoft\Windows\CurrentVersion\Authentication\LogonUI -Value LastLoggedOnSAMUser
    $User = Get-RegValue -ComputerName $_ -Key Software\Microsoft\Windows\CurrentVersion\Authentication\LogonUI -Value LastLoggedOnUser
    @('Name','LastUserAccessed','LastUserLoggedOn')
    New-Object PSObject -Property @{
    Name = ($SAM).ComputerName
    LastUserAccessed = ($User).Data
    LastUserLoggedOn = ($SAM).Data
    } | Export-csv Results.csv -notypeinformation -append}
    I would like it to read Name, LastUserAccessed, and LastUserLoggedOn but it comes out to be LastUserAccessed, Name, LastUserLoggedOn. Any ideas as to why this is happening? Thanks in advance!

    Okay, then instead of using 'New-Object PsObject -Property', use [pscustomobject].
    http://blogs.interfacett.com/powershell-v3-object-creation-accelerator
    EDIT: Or you can just use [ordered] if you'd prefer:
    http://stackoverflow.com/questions/7802406/use-cases-of-ordered-the-new-powershell-3-0-feature
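To make that concrete, the loop body from the question could be rewritten with [pscustomobject], which preserves the property order as written — a sketch based on the code above, untested against Get-RegValue:

```powershell
Foreach ($computer in $list) {
    $SAM  = Get-RegValue -ComputerName $computer -Key Software\Microsoft\Windows\CurrentVersion\Authentication\LogonUI -Value LastLoggedOnSAMUser
    $User = Get-RegValue -ComputerName $computer -Key Software\Microsoft\Windows\CurrentVersion\Authentication\LogonUI -Value LastLoggedOnUser
    [pscustomobject]@{
        Name             = $SAM.ComputerName   # columns export in this order
        LastUserAccessed = $User.Data
        LastUserLoggedOn = $SAM.Data
    } | Export-Csv Results.csv -NoTypeInformation -Append
}
```

With New-Object PSObject -Property, the hashtable's keys have no guaranteed order, which is why the CSV columns came out shuffled.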

  • Date Extract Issue

    Hi,
I have developed a SQL-based report in which the date fields are formatted with a to_char 'DD-MON-YYYY' mask.
The tester wants to pull the data into Excel. When they did so, the date displayed as '01-May-2009' became '01-May-09'. I have informed the tester that it is more of a Microsoft date-setup issue.
But they are very specific about the requirement. They neither want to change the setup of Excel nor perform any data formatting.
Any idea if this is possible through Oracle PL/SQL coding?
    Thanks,
    Roy

    ORARAHUL wrote:
The tester wants to pull the data into Excel. When they did so, the date displayed as '01-May-2009' became '01-May-09'. When you click inside the cell, the format will be '5/1/2009'. Excel is designed to behave this way with date types. If they want to see the date in the format you specified, ask them to change the type in the "Format Cells" window, or export the data into a text file.
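The text-file route can also sidestep Excel's date coercion entirely: if each date is written as an Excel text formula (="..."), Excel displays the literal string instead of re-parsing it. A small illustration of the transformation — Python is used here only to show the idea, and the column and file names are made up:

```python
import csv
import datetime

def excel_text(value: str) -> str:
    """Wrap a value as ="..." so Excel shows it literally instead of
    re-parsing it as a date (e.g. '01-May-2009' stays '01-May-2009')."""
    return '="{}"'.format(value)

def render_date(d: datetime.date) -> str:
    """Render a date the way a DD-MON-YYYY mask would."""
    return d.strftime("%d-%b-%Y")

# Write a one-column report with the date protected from Excel's coercion.
with open("report.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["order_date"])
    writer.writerow([excel_text(render_date(datetime.date(2009, 5, 1)))])
```

The same ||'="'||...||'"'|| concatenation could be done directly in the report's SQL if the tester insists on no Excel-side changes.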

  • Export issue

    I have been using LR for almost 2 years and have about 35,000 images in the program. I consider myself an advanced user and have never encountered an issue with the program. I am running v4.1 on Windows 7 and convert the raw images to DNGs when I upload from the card.
    Today I was exporting JPEGs to upload manually to my website. All went well for about 30 exports and then I opened another folder and noticed about 20 of 120 thumbnails were blank and about 40 had ! above the image. All open in LR but cannot be expanded to 100% view. I get an error message saying files are corrupted when I try to export. I see distorted images when I view them in Picasa.
    Any ideas?  I am ready to panic. Thanks.

    Lorendn
As I mentioned in another thread, you are not the only person who has been struck by data integrity issues. Recovery is quick when the problem comes from a bad cable, for example, where one can see the issue immediately. But when it happens silently (e.g. partial hard drive corruption) and over a long timeframe, it can have a much deeper impact.
    I put a feature request forward to implement an automatic photo integrity check into Lightroom:
    http://feedback.photoshop.com/photoshop_family/topics/photo_integrity_check
    Please feel free to add your vote.

  • Endeca data export - Single catalog only

    With ATG 10.1.2, is there a way to configure the Endeca data export / Baseline Index process (as run from the ProductCatalogSimpleIndexingAdmin component in dyn admin) to generate data for products and categories in a particular catalog only, instead of for all catalogs?
    Thanks!
    -Jon
    Edited by: Jon Schneider on Apr 9, 2013 5:43 AM

    cmore, thank you, that's very helpful!
I was looking into this question as a possible solution to an issue where, when running the baseline import process as part of the ATG-Endeca integration to upload data from ATG to Endeca, catalog data was not being generated properly; I suspected that part of the problem might be that my application has multiple catalogs.
    The problem turned out to be due to a missing default catalog configuration for my application's site, as described in my posts in this thread: Re: Baseline Index error (invoking from ProductCatalogSimpleIndexingAdmin)
    Since CategoryToDimensionOutputConfig doesn't have a value for the repositoryItemGroup property by default, and it looks like it would be a bit of work to get that set up properly and I no longer have a pressing need to do that, I'm setting this task aside for the time being. cmore, I appreciate your help regardless, though, since this information may be useful to myself or to others in the future. (I'll go ahead and mark your post as the solution.)
    Thanks,
    -Jon

  • Payroll IDocs - Master Data Export - US (PC00_M10_OTEM)

    Hi,
    I have one issue with the IDocs created by Master Data Export- US Program.
    Program Name : RPCEMDU0_CALL
    Tcode : PC00_M10_OTEM
The IDocs created by this program have some wrong values for Benefit infotypes.
Can anyone help me understand how the IDocs are created and where the IDoc segments get filled with data in this program?
I have debugged the program but didn't get any clues.
    Thanks in Advance.
    Regards,
    Sudhakar.

    Hello,
    I have some helpful information.
    The program that creates the IDOC is RPCOTM** where ** is your country version.
    The results of the OTSEL feature are checked in RPCOMF** and then the routine FILL_DATES(RPCOTFX0) is called to write the IDOC. 
    Debugging in this area will tell you...
    1.  Is the OTSEL value for IT0041 set
    2.  Was there a change in the dates for your period
    3.  What is being written to the IDOC
    Best of Luck
    Dan Reeves

  • Generating Master Data Export DataSources Error

    Hi Guys
I have the 0VENDOR InfoObject to be exported to another BI system, and I am using generated master data export DataSources. When I right-click on 0VENDOR and choose Generate Export DataSource, it should create 80VENDORM and 80VENDORT. But when I do that, it reports the generation successful (or whatever the success message is); when I go to check the InfoSources / DataSources / InfoProviders and try to search for 8zvendorM, I cannot find it. I also searched for "vendor" in all the tabs and only find 0VENDOR, not the newly created ones. Can anyone tell me where I should check for them? Also, please note that I am using the same Generate Export DataSource function for one DSO, and I can see that one under the unassigned nodes of the InfoSource tab.
Please let me know the solution ASAP.
    regards

Hi PB and Voodi,
I tried Voodi's solution and am still not able to find 80VENDORM and 80VENDORT. As for PB's, I cannot find the generated objects in the settings tab. Can you please let me know what the issue might be and how to find these DataSources?
Please let me know ASAP... regards

  • Zenworks Inventory Data Export

I am trying to get a report of our current systems using the Zenworks Inventory Data Export as shown in eDir.
When I run the export it also shows systems we have deleted. I am running ZfD 4.01 SP2.
Is there a way to get a report on only the systems that are in eDir, or to filter it for only current systems?
    Thanks,
    Paul

    paul,
    It appears that in the past few days you have not received a response to your
    posting. That concerns us, and has triggered this automated reply.
    Has your problem been resolved? If not, you might try one of the following options:
    - Do a search of our knowledgebase at http://support.novell.com/search/kb_index.jsp
    - Check all of the other support tools and options available at
    http://support.novell.com.
    - You could also try posting your message again. Make sure it is posted in the
    correct newsgroup. (http://support.novell.com/forums)
    Be sure to read the forum FAQ about what to expect in the way of responses:
    http://support.novell.com/forums/faq_general.html
    If this is a reply to a duplicate posting, please ignore and accept our apologies
    and rest assured we will issue a stern reprimand to our posting bot.
    Good luck!
    Your Novell Product Support Forums Team
    http://support.novell.com/forums/

  • What is your recommendation to improve data export time - MaxL?

We have a data export for backup scheduled every day using MaxL; it takes 2 hours and we would like to decrease it to 30 minutes or less.
    What is your recommendation?
    1. Hot back-up
    2. Parallel Calc (if possible)
    3. Copy DB files on Unix box.
    4. ???
    Environment 9.3.0.1
    Thanks in advance!
    Edited by: venuramini on Jul 10, 2009 2:52 PM

    CL,
    I have done the following and my results are very encouraging.
With 1 ExportThreads:   level 0: 9.40 min    all levels: 123.00 min   (2.8 GB)
With 2 ExportThreads:   level 0: 4.55 min    all levels: 72.78 min    (2.8 GB)
With 3 ExportThreads:   level 0: 3.32 min    all levels: 50.5 min     (2.8 GB)
With 4 ExportThreads:   level 0: 2.56 min    all levels: 36.8 min     (2.8 GB)
The only issue with exporting level-0 data is the aggregation afterwards, but that is a small inconvenience compared to a long export of data we very rarely need.
    Venu
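For anyone arriving here later: in MaxL, a parallel export of this shape is requested by naming multiple export files, and Essbase uses one export thread per file. A sketch with placeholder names (credentials, host, and app/db are assumptions):

```
login admin password on localhost;
/* four files => four export threads, level-0 data only */
export database sample.basic level0 data
    to data_file 'exp1.txt', 'exp2.txt', 'exp3.txt', 'exp4.txt';
logout;
```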
