Automating the creation/deletion of a workspace

Hi,
Can the .SQL file generated by exporting a workspace be used to re-create that workspace outside of the APEX Administration tool? Also, is it possible to generate a .SQL file that can be used to DROP a workspace (again, outside of APEX Administration)?
thanks,
Kevin.
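For creating and dropping workspaces entirely from SQL (outside the APEX administration UI), the documented APEX_INSTANCE_ADMIN package is worth a look. A minimal sketch is below; the package exists in APEX 4.x, but the workspace and schema names are placeholders and the exact parameter list should be checked against the API reference for your release.

-- Run as a database user with APEX instance administration privileges.
-- Create a workspace and map it to an existing database schema.
begin
  apex_instance_admin.add_workspace(
    p_workspace      => 'MY_WORKSPACE',
    p_primary_schema => 'MY_SCHEMA');
  commit;
end;
/

-- Drop the workspace again (optionally dropping its users/tablespaces).
begin
  apex_instance_admin.remove_workspace(
    p_workspace        => 'MY_WORKSPACE',
    p_drop_users       => 'N',
    p_drop_tablespaces => 'N');
  commit;
end;
/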

Hi Prabodh,
I've run into a problem with re-creating a workspace using the .SQL file. I'm executing the file from SQL Developer using the SYS account and I'm getting the following error:
Error starting at line 88 in command:
begin
-- This date identifies the minimum version required to import this file.
wwv_flow_team_api.check_version(p_version_yyyy_mm_dd=>'2010.05.13');
end;
Error report:
PLS-00201: identifier 'WWV_FLOW_TEAM_API.CHECK_VERSION' must be declared
The script has successfully executed its calls to wwv_flow_api and wwv_flow_fnd_user_api, so I don't know why it should not recognise wwv_flow_team_api.
The unrecognised package exists on the database and is owned by the account APEX_040000. However, the account APEX_030200 also exists on the database,
and I am wondering if this may have something to do with the problem. Any thoughts on this would be gratefully received,
regards,
Kevin.
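
If the two APEX schemas do turn out to be related, one way to narrow it down (a diagnostic sketch only, not a confirmed fix) is to check which APEX release is active and whether WWV_FLOW_TEAM_API is visible outside its owning schema:

-- Which APEX release is currently active?
select version_no from apex_release;

-- Who owns the package the import script cannot resolve?
select owner, object_type
  from dba_objects
 where object_name = 'WWV_FLOW_TEAM_API';

-- Is there a public synonym for it?
select table_owner
  from dba_synonyms
 where owner = 'PUBLIC'
   and synonym_name = 'WWV_FLOW_TEAM_API';

-- If everything resolves to APEX_040000, one option sometimes suggested
-- (untested here) is to set that schema as the parsing schema before
-- running the export file:
-- alter session set current_schema = APEX_040000;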

Similar Messages

  • Automating the creation of an HDInsight cluster

    Hi,
    I am trying to automate the creation of an HDInsight cluster using Azure Automation to execute a PowerShell script (the script from the Automation gallery). When I try to run it (even without populating any defaults), it fails with the following error:
    "Runbook definition is invalid. In a Windows PowerShell Workflow, parameter defaults may only be simple value types (such as integers) and strings. In addition, the type of the default value must match the type of the parameter."
    The script I am trying to run is:
    <#
     This PowerShell script was automatically converted to PowerShell Workflow so it can be run as a runbook.
     Specific changes that have been made are marked with a comment starting with “Converter:”
    #>
    <#
    .SYNOPSIS
      Creates a cluster with specified configuration.
    .DESCRIPTION
      Creates a HDInsight cluster configured with one storage account and default metastores. If storage account or container are not specified they are created
      automatically under the same name as the one provided for cluster. If ClusterSize is not specified it defaults to create small cluster with 2 nodes.
      User is prompted for credentials to use to provision the cluster.
      During the provisioning operation which usually takes around 15 minutes the script monitors status and reports when cluster is transitioning through the
      provisioning states.
    .EXAMPLE
      .\New-HDInsightCluster.ps1 -Cluster "MyClusterName" -Location "North Europe"
      .\New-HDInsightCluster.ps1 -Cluster "MyClusterName" -Location "North Europe"  `
          -DefaultStorageAccount mystorage -DefaultStorageContainer myContainer `
          -ClusterSizeInNodes 4
    #>
    workflow New-HDInsightCluster99 {
        param (
            # Cluster dns name to create
            [Parameter(Mandatory = $true)]
            [String]$Cluster,
            # Location
            [Parameter(Mandatory = $true)]
            [String]$Location = "North Europe",
            # Blob storage account that new cluster will be connected to
            [Parameter(Mandatory = $false)]
            [String]$DefaultStorageAccount = "tavidon",
            # Blob storage container that new cluster will use by default
            [Parameter(Mandatory = $false)]
            [String]$DefaultStorageContainer = "patientdata",
            # Number of data nodes that will be provisioned in the new cluster
            [Parameter(Mandatory = $false)]
            [Int32]$ClusterSizeInNodes = 2,
            # Credentials to be used for the new cluster
            [Parameter(Mandatory = $false)]
            [PSCredential]$Credential = $null
        )

        # Converter: Wrapping initial script in an InlineScript activity, and passing any parameters for use within the InlineScript
        # Converter: If you want this InlineScript to execute on another host rather than the Automation worker, simply add some combination of -PSComputerName, -PSCredential, -PSConnectionURI, or other workflow common parameters as parameters of the InlineScript
        inlineScript {
            $Cluster = $using:Cluster
            $Location = $using:Location
            $DefaultStorageAccount = $using:DefaultStorageAccount
            $DefaultStorageContainer = $using:DefaultStorageContainer
            $ClusterSizeInNodes = $using:ClusterSizeInNodes
            $Credential = $using:Credential

            # The script has been tested on Powershell 3.0
            Set-StrictMode -Version 3

            # Following modifies the Write-Verbose behavior to turn the messages on globally for this session
            $VerbosePreference = "Continue"

            # Check if Windows Azure Powershell is available
            if ((Get-Module -ListAvailable Azure) -eq $null) {
                throw "Windows Azure Powershell not found! Please make sure to install it."
            }

            # Create storage account and container if not specified
            if ($DefaultStorageAccount -eq "") {
                $DefaultStorageAccount = $Cluster.ToLowerInvariant()
                # Check if account already exists then use it
                $storageAccount = Get-AzureStorageAccount -StorageAccountName $DefaultStorageAccount -ErrorAction SilentlyContinue
                if ($storageAccount -eq $null) {
                    Write-Verbose "Creating new storage account $DefaultStorageAccount."
                    $storageAccount = New-AzureStorageAccount -StorageAccountName $DefaultStorageAccount -Location $Location
                } else {
                    Write-Verbose "Using existing storage account $DefaultStorageAccount."
                }
            }

            # Check if container already exists then use it
            if ($DefaultStorageContainer -eq "") {
                $storageContext = New-AzureStorageContext -StorageAccountName $DefaultStorageAccount -StorageAccountKey (Get-AzureStorageKey $DefaultStorageAccount).Primary
                $DefaultStorageContainer = $DefaultStorageAccount
                $storageContainer = Get-AzureStorageContainer -Name $DefaultStorageContainer -Context $storageContext -ErrorAction SilentlyContinue
                if ($storageContainer -eq $null) {
                    Write-Verbose "Creating new storage container $DefaultStorageContainer."
                    $storageContainer = New-AzureStorageContainer -Name $DefaultStorageContainer -Context $storageContext
                } else {
                    Write-Verbose "Using existing storage container $DefaultStorageContainer."
                }
            }

            if ($Credential -eq $null) {
                # Get user credentials to use when provisioning the cluster.
                Write-Verbose "Prompt user for administrator credentials to use when provisioning the cluster."
                $Credential = Get-Credential
                Write-Verbose "Administrator credentials captured.  Use these credentials to login to the cluster when the script is complete."
            }

            # Initiate cluster provisioning
            $storage = Get-AzureStorageAccount $DefaultStorageAccount
            New-AzureHDInsightCluster -Name $Cluster -Location $Location `
                -DefaultStorageAccountName ($storage.StorageAccountName + ".blob.core.windows.net") `
                -DefaultStorageAccountKey (Get-AzureStorageKey $DefaultStorageAccount).Primary `
                -DefaultStorageContainerName $DefaultStorageContainer `
                -Credential $Credential `
                -ClusterSizeInNodes $ClusterSizeInNodes
        }
    }
    Many thanks
    Brett

    Hi,
    it appears that [PSCredential]$Credential = $null is not correct; I also get the same
    error. Let me check further on it and get back to you.
    Best,
    Amar
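
    For what it's worth, the runbook error message itself points at the likely culprit: in a PowerShell Workflow, parameter defaults may only be simple value types or strings, and [PSCredential]$Credential = $null is neither. A minimal sketch of one possible workaround (an assumption based on the error text, not a verified fix for this runbook) is to declare the parameter without a default and handle the missing credential inside the InlineScript:

    workflow New-HDInsightCluster99 {
        param (
            # Declared without "= $null": workflow parameter defaults must be
            # simple value types or strings, so no default is given here.
            [Parameter(Mandatory = $false)]
            [PSCredential]$Credential
        )

        inlineScript {
            $Credential = $using:Credential
            # Prompt only if no credential was supplied to the runbook.
            if ($Credential -eq $null) {
                $Credential = Get-Credential
            }
        }
    }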

  • Automating the creation of telephone accounts in call manager

    Hi, I have recently been asked to explore the possibility of automating telephone account creation in Cisco Call Manager using scripting.  Although my scripting knowledge isn't great, one idea that was suggested was the use of .csv files.  Any advice or tips about how to proceed would be very much appreciated. Thanks.

    Hi Jaime, thanks for your reply.  I have already discussed the possibility of using CUCM 10 although unfortunately this was ruled out straight away as the cost of implementing the latest version would be too expensive for our department.  I am also looking into the Cisco BAT and have an account for the CBT Nuggets to get some training on BAT.  Additionally, I am looking into writing up a simple script that could be used to change the formatting of a .csv file containing staff details that could then be uploaded to CUCM.

  • Automating Account Creation/Deletion

    Hi All,
    I'm trying to find an easy way to manage (create/delete) user accounts in a lab full of computers using Apple Remote Desktop (ARD). I was thinking that sending shell commands might work, but I can't seem to find a command or script that creates/destroys user accounts. Is there such a command? Or is there a better way to do it?

    Check email templates for all the other emails.
    rgds,
    Suren

  • Automating the Creation Process for 1 video for 1000s?

    Here's what I want to do, but I don't know if it's possible:
    For my company, I have designed a fairly simple motion-graphics-style video for various products that we sell online. It involves the image of the product, some descriptions of it and a title, with some simple movement and transitions with music underneath. The problem, though, is that we have 1000s of products. Is there a way to create, buy, or have written a script that will be able to take the template project that I've created and use it to do a similar thing for the 1000s of other products (over a long time, presumably)? Just replace the image and text with the new product's image and text in predetermined locations.
    Does this make sense? Is it possible? If so, how?
    If not, what other approaches would you guys suggest to approaching this problem?
    Thanks a lot.
    William

    hi william!
    in short: yes.
    what you're asking for is very much possible, and actually not too complicated to do.
    i would recommend using AE's javaScript interface and not the C++ API.
    it's one hell of a lot easier to develop and also to adjust, and it has all the facilities you need.
    in essence:
    1. you supply a file with a list of the data you want to input for each product (i.e. file names, text, etc...)
    2. the script parses that file and iterates through the entries, each time importing the base project and replacing the footage and text.
    3. either render each project and move to the next, or create a big project containing any number of the entries and render it all together.
    4. tada!
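
    to make that outline concrete, here is a rough sketch of what such a script could look like in After Effects' ExtendScript (JavaScript). the comp name, layer names, file paths and CSV layout are invented placeholders, not taken from William's project, so treat it as a starting point rather than working code.

    // products.csv: one "imagePath,title" line per product (assumed layout)
    var dataFile = new File("~/Desktop/products.csv");
    dataFile.open("r");

    while (!dataFile.eof) {
        var fields = dataFile.readln().split(",");
        var imagePath = fields[0];
        var title = fields[1];

        // re-open a fresh copy of the template project for each product
        app.open(new File("~/Desktop/template.aep"));

        // find the main comp by name (assumed to be called "Main")
        var comp = null;
        for (var i = 1; i <= app.project.numItems; i++) {
            var item = app.project.item(i);
            if (item instanceof CompItem && item.name === "Main") {
                comp = item;
                break;
            }
        }

        // swap the placeholder footage and the title text
        comp.layer("ProductImage").source.replace(new File(imagePath));
        comp.layer("Title").property("Source Text").setValue(title);

        // queue and render this product's movie
        var rqItem = app.project.renderQueue.items.add(comp);
        rqItem.outputModule(1).file = new File("~/Desktop/renders/" + title + ".mov");
        app.project.renderQueue.render();
    }
    dataFile.close();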

  • The creation of the new workspace has been partially successful

    Hi,
    I tried to create a workspace called WSMedicina and the following error appears:
    The creation of the new workspace has been partially successful.
    Exception Message: Cannot find library item WsMedicina.
    The error cause: Cannot add Library Service.
    Please help me.
    Thanks.
    James.

    Hello James,
    What error appears in the application.log file for Workspaces when the creation fails? You can find this log at a path like this:
    $ORACLE_HOME/j2ee/OC4J_OCSClient/application-deployments/workspaces/OC4J_Workspaces_default_island_1
    Has creation of a library during new workspace creation worked in the past?
    Can the same user access the Content Services application successfully?
    regards,
    -Neil.

  • Run cleanup utility, then all IVI drivers in Measurement & Automation Explorer are deleted,

    Hello I have the following problem,
    I run the cleanup utility, and then all IVI drivers in Measurement & Automation Explorer are deleted.
    When I then want to add, for example, a new driver session, it is not possible to click on "create new".
    What is the problem?
    regards samuel

    Cedric,
    In this case it's the USB-serial adapter that is likely the cause of the issue.  As mentioned in this KB, it is likely that MAX doesn't know how to recognize the third-party adapter.  You may want to refer to this KB that talks about making VISA calls to third-party devices, as it may be useful.
    John B.
    Applications Engineer
    National Instruments

  • Stock transfer process - Automating the document creation/flow

    Hi,
    I am working on an implementation of ECC6 where the client has a requirement to automate creation of many of the system documents.
    We are following the standard SAP BPP J51: Internal Procurement (Stock Transfer With Delivery).  To summarise the BPP, the following process is followed:
    Stock Transfer (Purchase) order is created -> Outbound delivery is created with ref to the PO -> Picking/Goods Issue is done by the sending plant against the Outbound Delivery -> Inbound delivery is created -> Goods are Goods Receipted into the receiving plant.
    Our client has a requirement that the process is done as follows:
    - PO/Stock Transfer (ST) is manually created when a ST is required.
    - On Saving of the ST, the Outbound Delivery is automatically generated by the system with reference to the PO/ST.
    - The Picking is done and the Goods Issued. 
    - On saving the Goods Issue, the inbound Delivery is created with reference to the outbound delivery or PO/ST (or creation can be done at the same time as the creation of the Outbound Delivery).
    - Goods Receipt is then manually completed by the receiving plant and the process is complete.
    After searching various forums, I have found some people who say it can only be done through ABAP customisation and others who say that from ECC6 onwards it can be done via SPRO configuration.  The instructions I have found on how to do this via config were unclear and I was unable to get the process working.  I have been unable to find any information on this through help.sap.com or OSS notes.
    If anyone is able to provide the steps required to set this up or a link to reference documents i can use to find the answer, it would be much appreciated. Please advise if you need more info.
    Thanks

    Hello Abhishek.
    Please see the following information on workflows available.
    http://help.sap.com/saphelp_utilities472/helpdata/en/38/1a6c35a018d041e10000009b38f839/frameset.htm
    Also see object DISCONNECT in transaction SWO1 Disconnection.Create...
    I hope I have understood your request and this information is helpful.
    Regards
    Olivia

  • How to protect the creation of a db across multiple threads/processes?

    Given a multi-process, multi-threaded application, and one database file to be created, how can I guarantee that only one of the threads in one of the processes successfully creates the database, when ALL of the threads are going to either attempt to create it, or open it (if it already exists) upon startup?
    My current logic for all threads is:
    set open flags to DB_THREAD
    start transaction
    attempt to open the db
    if ENOENT
    abort transaction
    change open flags to DB_CREATE | DB_EXCL | DB_THREAD
    retry
    else if EEXIST
    abort transaction
    change open flags to DB_THREAD
    retry
    else if !ok
    # some other error
    end
    commit transaction
    I'm testing on Linux right now, with plans to move to Windows, AIX, and Solaris. What I'm experiencing on Linux is that several of the threads (out of the 10 threads I'm testing) will succeed in creating the database. The others will either succeed in opening the first time through, or will receive EEXIST when they do the open with the create flags - ultimately, they open the same created db (I'm presuming the last one that's created by one of the other threads). Effectively, the open with DB_CREATE | DB_EXCL is not ensuring that only one DB is created. I was under the impression that opening in a transaction would guarantee this, but it does not, or maybe I'm doing something incorrectly?
    Should DB_CREATE | DB_EXCL and opening in a transaction guarantee that only one thread can create the database? Do I need to use another synchronization method?
    Note: I am running off of a local disk, not over NFS or anything like that.
    I tried taking out my transaction and using DB_AUTO_COMMIT instead, still no go - multiple threads still report they successfully created the DB. Using BDB 4.5.
    Thanks,
    Kevin Burge

    Brian,
    Thanks for the reply. I think I'm doing what you said, unless I'm misunderstanding. I do have all threads try to do the DB_CREATE | DB_EXCL. Are you saying I shouldn't use the DB_EXCL flag?
    The problem I was seeing with 10 threads calling open w/ DB_CREATE | DB_EXCL on the same db:
    * Between 1 and 9 threads would return success from db->open with the creation flags.... but the last one to create "wins".
    * All the other threads would get EEXIST, as expected.
    The threads that "lost", do get a successful return code from "open" with the create flags, but all data written to them is lost. They act normally except for the fact that the have a deleted file-handle that they are writing to. There's no indicator that records written to them are going into the void.
    My test:
    I had 10 threads each trying to create or open a recno db, then append 10 records, for a total of 100 records expected. Ultimately, I would end up with between 20 to 100 records in the db, depending on how many of the threads said they successfully created the db. So, if 5 threads said they created the db successfully, then 40 records would be missing, because 4 of those threads were writing to deleted file handles. If 2 threads said they created the db, then 10 records would be missing....If 8 threads, then 70 records missing, etc.
    In other words, multiple threads creating the db appears to work correctly, because there are no errors. It was the missing records that caught my attention, and prompted my question on this forum.
    For what it's worth, I've worked around the problem by opening a similarly named file via the open() system call, with O_CREAT|O_EXCL, which is guaranteed to be atomic. The first thread that can create this temp file is the only thread that can actually create the db - all others sleep upon open with ENOENT until it's created.
    Thanks,
    Kevin
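
    To make the workaround concrete, here is a rough sketch of the sentinel-file approach Kevin describes; the sentinel path and the surrounding retry logic are illustrative assumptions, not code from the original post.

    #include <fcntl.h>
    #include <unistd.h>
    #include <errno.h>

    /* Returns 1 if the caller won the right to create the database,
     * 0 if another thread/process already created (or is creating) it,
     * -1 on an unexpected error. open() with O_CREAT|O_EXCL is atomic,
     * so exactly one caller can ever see success. */
    static int acquire_create_lock(const char *sentinel_path)
    {
        int fd = open(sentinel_path, O_CREAT | O_EXCL | O_WRONLY, 0644);
        if (fd >= 0) {
            close(fd);
            return 1;   /* we created the sentinel: we create the db with DB_CREATE */
        }
        if (errno == EEXIST)
            return 0;   /* somebody else got there first: open without DB_CREATE */
        return -1;
    }

    The winner opens the database with DB_CREATE; every other thread opens it with plain DB_THREAD and, as Kevin describes, sleeps and retries while the open still returns ENOENT.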

  • Automating album creation for a new project?

    I have several projects (such as specific event shoots) with the same basic set of albums for that project (some regular albums, some smart albums). I'll have albums for specific subsets of the events (for my sports shots, there's Individuals, Action, Team, etc.).
    I'd like to automate the creation of these albums, as it gets a bit tedious having to create a bunch of albums manually for each new project I create. I looked at Aperture's automator actions, but couldn't find anything related to album creation. Is there such a thing?
    Thanks...
    David

    I'm pretty sure this is scriptable using AppleScript. I haven't really had the need to script Aperture so I can't tell you the exact syntax without looking it up myself, but it shouldn't be too hard if you've ever used AppleScript. Just open Script Editor, go to File > Open Dictionary... > Aperture and you'll get a list of the scriptable actions Aperture has available.

  • Limit on deleting members from workspace

    Hello,
    I wanted to know if there is a limit on how many members can be deleted at once from a dimension in a classic Planning application. I know that there is a parameter in the Outline Load utility that would delete the entire dimension and would load the members from scratch from the metadata file you specify. But is there a better way of deleting the required members apart from using the Outline Load utility parameter and deleting from the workspace?
    We are trying to cleanup our metadata and also trying to delete some members from a dimension but it looks like it is taking ~ 40 mins for deleting ~ 3000 records. Does this sound reasonable?
    Also, I am seeing some strange and unexpected behavior while performing these operations. When I select a parent member (that has ~ 3000 child members) and hit delete, it shows like it's doing something in the status bar of the browser, but after a while that disappears and the page doesn't refresh. It shows neither a success message nor any error, which is confusing as it's not telling me whether it did the delete or not. So, when this happened I went and checked the dimension and the parent-level member was still in there without getting deleted. Then I tried deleting a different parent member and after a while encountered an error saying -
    An error occurred while processing the page. Please check the logs for details. (Can someone please let me know the name of the log file that I need to look at, and where it is?)
    Another time it threw an error -
    You are trying to change data that has been changed by a user on another server. Wait a few seconds and try again. If you continue to see this error message please contact your administrator.
    So basically I am seeing all sorts of different things happening and the deletion process is not going smoothly.
    Please share your thoughts on this.
    Thanks.
    ~ Adella

    Depending upon your version, the log file location will vary.
    in 11.1.2.x version you can check the logs under
    <epm drive>:\Oracle\Middleware\user_projects\epmsystem1\diagnostics\logs\services\HyS9Planning-sysout.log
    <epm drive>:\Oracle\Middleware\user_projects\epmsystem1\diagnostics\logs\services\HyS9Planning-syserr.log
    <epm drive>:\Oracle\Middleware\user_projects\domains\EPMSystem\servers\Planning0\logs\Planning0.log
    You can also specify an operation with the Outline Load utility to delete (update, which will update the members; delete level 0; delete idescendants; delete descendants).
    Regards
    Celvin
    http://www.orahyplabs.com
    Please mark the responses as helpful/correct if applicable

  • Can I automate the creation of a cluster in LabView using the data structure created in an autogenerated .CSV, C header, or XML file?

    Can I automate the creation of a cluster in LabView using the data structure created in an auto generated .CSV, C header, or XML file?  I'm trying to take the data structure defined in one or more of those files listed and have LabView automatically create a cluster with identical structure and data types.  (Ideally, I would like to do this with a C header file only.)  Basically, I'm trying to avoid having to create the cluster by hand, as the number of cluster elements could be very large. I've looked into EasyXML and contacted the rep for the add-on.  Unfortunately, this capability has not been created yet.  Has anyone done something like this before? Thanks in advance for the help.  

    smercurio_fc wrote:
    Is this something you're trying to do at runtime? Clusters are fixed data structures so you can't change them programmatically. Or, are you just trying to create some typedef cluster controls so that you can use them for coding? What would your clusters basically look like? Perhaps another way of holding the information like an array of variants?
    You can try LabVIEW scripting, though be aware that this is not supported by NI. 
     Wow!  Thanks for the quick response!  We would use this cluster as a fixed data structure.  No need to change the structure during runtime.  The cluster would be a cluster of clusters with multiple levels.  There would be no pattern as to how deep these levels would go, or how many elements would be in each.  Here is the application.  I would like to be able to autocode a Simulink model file into a DLL.  The model DLL would accept a Simulink bus object of a certain data structure (a bus of buses), pick out which elements of the bus are needed for the model calculation, and then pass the bus object.  I will then take the DLL file and use the DLL VI block to pass a cluster into the DLL block (with identical structure to the bus in Simulink).  To save time, I would like to auto-generate the C header file using Simulink to define the bus structure and then have LabVIEW read that header file and create the cluster automatically.  Right now I can do everything but the auto creation of the cluster.  I can manually build the cluster to match the Simulink model bus structure and it runs fine.  But this is only for an example model with a small structure.  I need to make the cluster creation automated so it can handle large structures with minimal brute force. Thanks!

  • How to stop the creation of /home on reboot?

    I assume it's systemd creating /home since I never had this situation with arch init scripts.   Maybe it's something else?
    I will delete /home and upon reboot it will return.
    I do not use /home and do not like standard user hierarchy and am tired of looking at the empty folder every startup.
    If this is indeed systemd (I'm not too familiar with it yet since I only updated to it about a month ago), does anyone know the related .service or .target (etc) doing this?  I tried searching for /home inside the files but so far no luck.  I did 'grep' and there are binary matches but not much I can do about that.  Maybe I'm looking in the wrong place.  /usr/lib/systemd, /etc/systemd/, /usr/lib/udev
    Feels like the appropriate section to post this.  I'm rarely online so was hoping to throw this out there and hope to get an answer in the right direction.  Thanks.
    EDIT:  Sorry if a similar thread exists.. I couldn't find any as it has rather vague keywords.

    Yeah, the /home folder is empty because I do not use it and no users $HOME are pointing there.  I also have my XDG settings pointing elsewhere, so that can't be it.
    When I get home I'll try modifying the useradd file but not sure if that will do it.  Although, I wonder if it's reading from another system login type file that defaults to /home..
    or perhaps it's just creating a standard filesystem hierarchy.  Hmm!!
    I would resort to deleting it with a script at boot-up but since I don't use /home and it doesn't have a separate partition it writes to my / partition which is a Solid State Drive.
    I know having it created and deleted at each boot may seem trivial but I like to avoid unnecessary writes/deletes on the SSD no matter how insignificant it may seem.
    Also, I just really want to find out where this is coming from.  I do my best to know what my setup is doing and hate not knowing why something keeps happening. 
    So, I'm still probing to where the creation stems from.  I wish there was some system monitor or journalctl would tell me more specifically what systemd is doing.  Maybe I'll delete it and check timestamps and match against systemd service files being ran?  May take some time but at least it's something to try.

  • Need of User-Exit  in the creation of Sales Order(VA01)

    Hi,
    In the creation of Sales Order, I need to Compare the Ordered Quantity and Confirmed Quantity. If the CQ is less than OQ I need to create one more line item with the same material for the rest of the quantity and send the request to Production order for the remaining quantity. For this I am unable to get the exact exit. Please help me in this regard ASAP.
    Thank you.

    The following programs are the user exits for billing.
    we often use RV60AFZC and RV60AFZZ.
    RV60AFZA
    RV60AFZB
    RV60AFZC
    RV60AFZD
    RV60AFZZ
    RV60BFZA
    For Sales order
    Pricing, item addition/deletion
    MV45AFZZ
    First, I did not find documentation for the BADI either. But at the first glance the process of implementing it looks quite straightforward. I assume you run R/3 Enterprise (4.7). So, you should implement BADI 'BADI_SD_SALES' - this must be done in transaction SE19. In particular for the purpose of adding some additional items into sales document I would implement method SAVE_DOCUMENT_PREPARE. This method has changing table parameter FXVBAP of type VA_VBAPVB_T - it holds all the sales document items. Just add items of yours to it. Certainly, you have to fill all the appropriate fields carefully.
    Hope this helps somehow.
    In that case you should use USEREXIT_DOCUMENT_SAVE_PREPARE subroutine (form). As far as I remember it has no parameters. To add items to the sales document you should modify internal table XVBAP.
    regards
    vinod

  • List of all the process types used in the creation of process chains

    Hi Gurus,
    I am new to process chain creation. Can anyone give me the list of all the process types used in the creation of process chains, and their uses as well?
    Please search the forum before posting a thread
    Edited by: Pravender on Jan 4, 2011 4:18 PM

    Hi,
    A process chain is used to automate the data load steps.
    For example, if you are loading data from R/3 to a DSO and a cube:
    - you require an InfoPackage to load up to PSA,
    - a DTP to load up to the DSO,
    - and another DTP to load the cube.
    All the other process types, like PSA deletion, you can find in the RSPC tab.
    Thanks,
    Saveen Kumar
