Any Export Command Working?

I see in the README a reference to exporting DDL not working. I get ORA-942 and then a null pointer exception on any of the export menu items:
Running against Oracle 10g XE. Raptor looks good so far as a Toad/Navigator replacement; I'm surprised at how quick it is.
java.lang.NullPointerException
     at oracle.dbtools.raptor.dialogs.export.ColumnPanel.addColumnsToTree(ColumnPanel.java:72)
     at oracle.dbtools.raptor.dialogs.export.ColumnPanel.<init>(ColumnPanel.java:48)
     at oracle.dbtools.raptor.dialogs.actions.TableExportAction.launch(TableExportAction.java:44)
     at oracle.dbtools.raptor.dialogs.BasicObjectModifier.launch(BasicObjectModifier.java:145)
     at oracle.dbtools.raptor.dialogs.BasicObjectModifier.handleEvent(BasicObjectModifier.java:210)
     at oracle.dbtools.raptor.dialogs.actions.XMLBasedObjectAction$DefaultController.handleEvent(XMLBasedObjectAction.java:265)
     at oracle.ide.controller.IdeAction.performAction(IdeAction.java:530)
     at oracle.ide.controller.IdeAction$1.run(IdeAction.java:785)
     at oracle.ide.controller.IdeAction.actionPerformedImpl(IdeAction.java:804)
     at oracle.ide.controller.IdeAction.actionPerformed(IdeAction.java:499)
     at javax.swing.AbstractButton.fireActionPerformed(AbstractButton.java:1849)
     at javax.swing.AbstractButton$Handler.actionPerformed(AbstractButton.java:2169)
     at javax.swing.DefaultButtonModel.fireActionPerformed(DefaultButtonModel.java:420)
     at javax.swing.DefaultButtonModel.setPressed(DefaultButtonModel.java:258)
     at javax.swing.AbstractButton.doClick(AbstractButton.java:302)
     at javax.swing.plaf.basic.BasicMenuItemUI.doClick(BasicMenuItemUI.java:1000)
     at javax.swing.plaf.basic.BasicMenuItemUI$Handler.mouseReleased(BasicMenuItemUI.java:1041)
     at java.awt.Component.processMouseEvent(Component.java:5488)
     at javax.swing.JComponent.processMouseEvent(JComponent.java:3126)
     at java.awt.Component.processEvent(Component.java:5253)
     at java.awt.Container.processEvent(Container.java:1966)
     at java.awt.Component.dispatchEventImpl(Component.java:3955)
     at java.awt.Container.dispatchEventImpl(Container.java:2024)
     at java.awt.Component.dispatchEvent(Component.java:3803)
     at java.awt.LightweightDispatcher.retargetMouseEvent(Container.java:4212)
     at java.awt.LightweightDispatcher.processMouseEvent(Container.java:3892)
     at java.awt.LightweightDispatcher.dispatchEvent(Container.java:3822)
     at java.awt.Container.dispatchEventImpl(Container.java:2010)
     at java.awt.Window.dispatchEventImpl(Window.java:1774)
     at java.awt.Component.dispatchEvent(Component.java:3803)
     at java.awt.EventQueue.dispatchEvent(EventQueue.java:463)
     at java.awt.EventDispatchThread.pumpOneEventForHierarchy(EventDispatchThread.java:242)
     at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:163)
     at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:157)
     at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:149)
     at java.awt.EventDispatchThread.run(EventDispatchThread.java:110)

This is known not to be working. In the meantime, you can look at the SQL tab of the object to get the SQL.
-kris

Similar Messages

  • Command "service-policy input policy-name permit-any" will not work

    Hi all,
I have an SG500 with the latest firmware, but this command will not work:
service-policy input QoS_01 permit-any
I get this error message:
% Wrong number of parameters or invalid range, size or characters entered
Without the "permit-any" or "deny-any" option, the command succeeds.
What is the reason?
It is important to specify these options directly; otherwise you lose access to the switch.
    Regards
    Stefan

    Hi Tom,
I have an ACL/ACE and created a QoS policy table, putting the policy class map (with class mappings) in it.
Now I want to bind this QoS policy to an Ethernet port.
The CLI tutorial example says:
    Use the service-policy Interface Configuration (Ethernet, Port-channel) mode command to bind a policy map to a port/port-channel. Use the no form of this command to detach a policy map from an interface.
    This command is only available in QoS advanced mode.
    Syntax
    service-policy input policy-map-name default-action [permit-any | deny-any]
    no service-policy input
    Example:
switchxxxxxx(config-if)# service-policy input policy1 permit-any
Cisco support opened a ticket for me.
    -Stefan

  • How can I get attribute details (user name, mail, etc.) from a CSV or text file of sAMAccountNames through PowerShell or any other AD command?


    OK, what about if I need to get all the important attributes by comparing email addresses from an Excel file and get all the required answers?
    Currently I am trying to verify how many user entries are missing employee numbers and phone numbers in AD, against the HR list available to me.
    I am trying to scan all of AD against the HR Excel sheet and quickly find how many accounts are active, plus line manager names, phone numbers, locations, titles, and AD IDs.
    These are the fields I am interested in getting in the output file after scanning the Excel file and getting the reply from AD, in another Excel or CSV file:
    Name, AccountName, Description, EmailAddress, LastLogonDate, Manager, Title, Department, Company, whenCreated, AcctEnabled, Groups
    Name,SamAccountName,Description,EmailAddress,LastLogonDate,Manager,Title,Department,Company,whenCreated,Enabled,MemberOf | Sort-Object -Property Name
    Can you modify this script to help me out :)
    Hi,
    It depends on which attributes you want.
    Import-Module ActiveDirectory
    #From a txt file
    $USERS = Get-Content C:\Temp\USER-LIST.txt
    $USERS|Foreach{Get-ADUser $_ -Properties * |Select SAMAccountName, mail, XXXXX}|Export-CSV -Path C:\Temp\USERS-ATTRIBUTES.csv
    #or from a csv file
    $USERS = Import-CSV C:\Temp\USER-LIST.csv
    $USERS|Foreach{Get-ADUser $_.SAMAccountName -Properties * |Select SAMAccountName, mail, XXXXX}|Export-CSV -Path C:\Temp\USERS-ATTRIBUTES.csv
    Regards,
    Dear Gautam Ji,
    Thanks for replying. I tried both, but they did not work for me. Instead, this command, which I extended, generated nice results:
    Get-ADUser -Filter * -Property * |
    Select-Object Name,Created,createTimeStamp,DistinguishedName,DisplayName,EmployeeID,EmployeeNumber,Enabled,HomeDirectory,LastBadPasswordAttempt,LastLogonDate,LogonWorkstations,City,Manager,MemberOf,MobilePhone,PasswordLastSet,BadLogonCount,pwdLastSet,SamAccountName,UserPrincipalName,whenCreated,whenChanged |
    Export-CSV Allusers.csv -NoTypeInformation -Encoding UTF8
    The only problem is that the Manager column generates this output rather than showing the exact name of the line manager:
    CN=Mr XYZ ,OU=Users,OU=IT,OU=Departments,OU=Company ,DC=organization,DC=com,DC=tk

  • Level0 Parallel export not working for BSO

    Hi,
    We have a BSO cube with 15 dimensions, 2 dense and the rest sparse. The total number of members is around 9000.
    We added 400 new members to one of the sparse dimensions, after which the parallel export stopped working. I started checking by deleting the new members one by one and exporting. When I had deleted 8 members, the export worked fine; if I add one extra member after that, the parallel export does not happen.
    The strange thing is that if I add a member as a child at any generation, the export works fine. If I add a member as a sibling starting from generation 3, the export does not work.
    Example: A is the dimension and B is a child of A. If I add a new member C as a sibling of B, the parallel export works.
    Now add D as a child of C and E as a sibling of D. D here is the third generation, and adding a sibling from here on makes the parallel export stop working.
    I'm using a simple command: export database 'APP'.'DB' level0 data in columns to data_file "'D:\exp.1.txt'","'D:\exp.2.txt'";
    If I use a single file, the export works fine.
    Does anybody have an idea whether BSO has a member limit at certain generations? Please let me know your thoughts on this.
    Thanks,
    Swetha
    Edited by: Swetha on Dec 16, 2010 3:16 AM

    Dave,
    I'm facing the same issue you faced earlier: export files are created with only a header. The issue here is that I'm not getting data in any of the files; they all have just the header. I changed the order of the sparse dimensions so that my last sparse dimension is the largest, with 2250 members. Still, the export creates 1 KB files with only headers.
    The reason for changing a sparse dimension to dense is that we have a scenario dimension with Actual, Operating, and Replan members, and data is loaded for all years and all periods for all scenarios. So we thought it could be made dense, and we didn't observe much difference in behaviour; the export also works fine that way. One has to analyze their outline structure and data before going for this option.
    I would still prefer the old outline and would appreciate your help on this.
    Swetha
    Edited by: Swetha on Dec 23, 2010 12:51 AM

  • Export: command not found

    I have been trying to install something via MacPorts and was following a blog that described setting the PATH appropriately so that the port command would work. However, when I updated .bash_profile as described in the blog and tried to source it, it gave me the error "export: command not found."
    Currently, echo $PATH returns: /Library/Frameworks/Python.framework/Versions/2.7/bin:/usr/bin:/bin:/usr/sbin:/sbin:/usr/local/bin:/usr/X11/bin
    Any thoughts on how to proceed?

    Please post the actual code.
    Also are you sure you are using the 'bash' shell (or sh, ksh, zsh shells)?  With all these shells 'export' is a built-in command and should never be "not found".
    Could you be using the csh or tcsh shell?  If you are using one of these shells, then setting environment variables is done using the 'setenv' command, which is also a csh/tcsh built-in.
    Use the following command to see what your default shell is:
    dscl . -read $HOME shell
    If you have csh or tcsh as your shell, then use something like the following to set your PATH environment variable:
    setenv PATH "$PATH:/the/stuff/you/are/adding"
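    The difference between the two shell families can be sketched in a few lines, assuming an sh-family shell (the /opt/local/bin path below is the usual MacPorts location; substitute whatever directory your blog told you to add):

    ```shell
    # bash/sh/ksh/zsh: 'export' is a shell built-in, so it is always available;
    # this appends the MacPorts directory to PATH for this shell and its children.
    PATH="$PATH:/opt/local/bin"
    export PATH

    # csh/tcsh have no 'export' command at all; the equivalent there is
    # (shown as a comment, since this sketch is written for sh-family shells):
    #   setenv PATH "${PATH}:/opt/local/bin"

    echo "$PATH"
    ```

    If your .bash_profile contains lines like the first two and you still see "export: command not found", the file is almost certainly being read by csh/tcsh rather than by bash.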

  • Export command to back up data

    Dear All
    I am developing a system for a store, and I want to use the export command to let the user back up the data, but I have these four questions:
    1. Is using the export command a good practice for the backup process?
    2. At what time of day should the customer back up his system?
    3. Say he does the backup at the end of each working day: could he back up only the new data that has not been backed up before, or should he always back up all the data in the database, regardless of whether it was backed up before?
    4. If the user does a backup at the end of each day, what happens if the system crashes during a working day? Will he lose the data that was updated but not yet backed up?

    An export backs up the data, not the database. That's the fundamental difference between a 'logical' and a 'physical' export. Instead of copying files (from disk to disk or disk to tape), you are selecting out the data that has been previously inserted into the database.
    There's nothing wrong with that as such, but it doesn't scale: selecting out 100 million rows is going to take about 100 times as long as selecting out just 1 million. That's why you heard that export was OK as a backup mechanism for small databases: for large databases, it's a non-starter, simply because it would take days to select all the data out (and even longer to insert it all back again). On the other hand, physical backups do scale to a certain extent, because a lot of data files are mostly empty space for a lot of the time. So a 100GB database could probably be storing 1 million rows or 10 million rows or even (at a pinch) 100 million rows: data files don't grow every time you insert a new row, in other words. So almost regardless of how much data you are storing in the database, the time taken to back it up isn't going to vary that much. (There will be periodic points, of course, where you fill all available space and make the database jump in size by an increment... but between two of those points, you can keep piling the records in and the database size won't change).
    So that's one thing: export doesn't scale.
    Second big thing is that once you start dealing with rows of data, you've completely lost the ability to recover things using Oracle's entirely physical recovery mechanism. That is, with a physical backup of a datafile made on Sunday night, and a record of all the changes done to data on Monday, Tuesday and Wednesday (called redo), if I lose a datafile on Thursday morning, I can restore that one file from Sunday's backup, replay all the data changes in the redo stream and get my entire database back up and running exactly at the point it had gotten to when the datafile was damaged. That's called a complete recovery and it is such a reliable and robust mechanism that you are guaranteed not to lose any committed data using it: you get everything back (so long as you keep all your redo since your last backup).
    But that guarantee only applies when you are taking physical copies of data files and storing redo in physical redo logs. If, instead, you are storing rows of data in export dump files, then that mechanism doesn't apply to you at all.
    So, if you export a big table on Sunday, do lots of transactions to it on Monday, Tuesday and Wednesday, and then accidentally drop or damage the table on Thursday, the best you can do is import the table from Sunday's export... and that is it. You cannot apply the Monday, Tuesday and Wednesday redo to that restored table data. You cannot roll the table forward in time. Therefore, you lose all the committed data that was entered into that table on those three days.
    Export is, in other words, a snapshot technology: you have replicas of your data as they were at some point in the past. You can always "restore" that snapshot, but that is as good as the recovery gets.
    That's a big issue. It means that if you care about your data at all, or at least value it enough not to want to lose any of it, you cannot rely on export as a way of backing it up. If export is your only backup mechanism, you will lose data. That's a promise, incidentally: it absolutely will happen one day. It's not a 'maybe', just a question of time.
    If you only did physical file copies, though, you would not lose data (again, always assuming you look after that stream of redo).
    On the other hand, there are times when you positively want to extract data out of one database and re-insert it into another, such as when you decide your database was created using the wrong block size. In those circumstances, physical backups don't help (because they simply copy the 'wrong' block size around the place). But an export does exactly what's required, separating the data from the physical files it's normally stored in, and so is very handy for that sort of situation. Similarly, if I need to preserve one table for audit purposes, copying the entire 16GB data file in which its 1000 records are stored is not a very efficient way of doing that! The fact that export works on an object-by-object basis, rather than a file-by-file one, means it's perfect for that sort of data extraction job.
    Think of export, therefore, more as a system utility that happens to involve copying data. Yes, a copy can come in handy when disaster strikes and all else fails, but that's not its principal purpose. It might be sufficient functionality to provide the level of data protection you need, but it doesn't and cannot provide complete data protection, because it lacks that essential 'roll-forwardability' that physical backups (datafile copies) have.
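    To make the roll-forward point concrete, here is a toy shell sketch, with a plain file standing in for a datafile and another for the redo stream. Nothing here is Oracle-specific; it is purely an illustration of why a physical backup plus redo recovers everything, while a snapshot alone cannot:

    ```shell
    # Toy model: 'datafile' is the database, 'redo.log' records every change.
    printf 'row1\n' > datafile
    cp datafile backup_sunday              # Sunday: physical backup

    printf 'row2\n' >> datafile            # Monday: new committed data...
    printf 'row2\n' >> redo.log            # ...and the same change recorded as redo

    rm datafile                            # Thursday: the datafile is lost

    cp backup_sunday datafile              # restore Sunday's backup...
    cat redo.log >> datafile               # ...and replay the redo: complete recovery

    cat datafile                           # both row1 and row2 are back
    ```

    With only an export taken on Sunday, the "restore" step would be the end of the story: you would get row1 back, but Monday's row2 is gone, because a dump file cannot participate in redo replay.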

  • Is there any way to work on one Catalog from two computers simultaneously?

    I have a catalog with 7000 images we have to process, crop, etc., and I was trying to find a way for two of us to work on the images at the same time.
    Thanks!

    No, the LR catalog is not multi-user capable (and cannot reside on a network share without tricks). What you could do is:
    export part of the catalog (a subset of all images) into a separate catalog ("Export as Catalog", without exporting the negatives)
    work on different images in the two catalogs at the same time
    re-import ("Import from Catalog") the exported work after the adjustments to it are finished
    Beat

  • Could not complete the export command because of an application error

    Every once in a while I get the error "Could not complete the export command because of an application error" when using the Save for Web option. It's really confusing because the error is not consistent: I don't get it all the time. In fact, 99% of the time I can save with no problems. When I quit Photoshop and reopen my files, they seem to save just fine. Does anyone know why this happens and how to prevent it from happening in the future?
    Here are my specs:
    Macintosh: iMac
    OS: 10.6.8
    Photoshop: CS3 (v10.0.1)
    Any help is greatly appreciated!

    No idea about the specific issue, but I like to pass on some general advice I have received myself here over time:
    Boilerplate-text:
    Are Photoshop and OS fully updated?
    As with all unexplainable Photoshop-problems you might try trashing the prefs (after making sure all customized presets like Actions, Patterns, Brushes etc. have been saved and making a note of the Preferences you’ve changed) by pressing command-alt-shift on starting the program or starting from a new user-account.
    System Maintenance (repairing permissions, purging PRAM, running cron-scripts, cleaning caches, etc.) might also be beneficial, Onyx has been recommended for such tasks.
    http://www.apple.com/downloads/macosx/system_disk_utilities/onyx.html
    Weeding out bad fonts never seems to be a bad idea, either. (Validate your fonts in Font Book and remove the bad ones.)
    If 3rd party plug-ins are installed try disabling them to verify if one of those may be responsible for the problem.

  • Can't get Export to work in iPhoto 6.0

    I can't get Export to work. I haven't been a big iPhoto user, but I upgraded from version 2.0.1 to 6.0 four days ago. I had never used Export before; however, going through the '06 tutorial I have tried and failed several times (though dragging to the desktop works). I select a photo and choose File-->Export, and the "Export Photos" dialog comes up. I hit the "File Export" button at the top, per iPhoto Help. Nothing happens; the File Export pane remains blank. Hitting "Export" doesn't work either; only the Cancel button works. I have a relatively small library: about 200 photos from before the upgrade and 100 imported using the new alias feature. I would appreciate any advice.
    PowerMac G4 QuickSilver   Mac OS X (10.4.4)   1.12 GB RAM

    Hi Walt,
    did you use an iPhoto plugin before? Maybe you have an outdated Flickr plugin that needs to be updated or disabled.
    Do a "Get Info" on iPhoto.app and see if you have one installed.
    The plugins are listed at the bottom of the "Get Info" panel.

  • Problem in export/import work repo !!!

    Hi All,
    Good Morning.
    I have a problem with exporting and importing repositories. The description of the problem is given below.
    I have a remote machine on which Sunopsis is installed; all the development, implementation, and testing happen there.
    Now I need to replicate the same environment on my local desktop (ODI is installed and the master and work repositories are created).
    When I export the work repo from my remote machine and try to import it on my local machine (from the Designer-level import -> work repo), after a while it gives the error "snp_lschema does not exist".
    Any one have idea why this is happening?
    Thanks,
    Guru

    Hi Julien,
    Thanks for your input. It is really helpful for me.
    I need some more clarification.
    I exported my master and work repos from my remote machine (as a zip file) and saved it on my local drive (say D:/downloads/deploy.zip...).
    When I tried to import the master repo from Topology (browsing to the zip file and clicking OK), it does not do anything; I mean, nothing happens after I click OK.
    Should I copy this master repo zip into the Sunopsis installation folder (the IMPEXP directory) and then import it? Am I doing it right?
    Please advise.
    Thanks,
    Guru

  • Any 2 buttons work, any 3 don't?

    Hi everyone,
    I'm seeing some unintended behavior from buttons on my site, and there seems to be no logic to it that I can just fix. The site is arcotecture.info, if you wouldn't mind taking a look. On pages such as "(r)Evolve" and "undo," there is a vertical sequence of thumbnail buttons on the right that triggers a sliding image viewer to see them larger. Once the viewer is open, it should be possible to click on any of the other thumbnails and go right to the large image without moving the viewer at all. But for some reason, any two combinations work, and any three don't. Try viewing a third once the viewer is open, and the slider closes, even though the script would never have told it to do that. The only way to make it close should be the minus-sign button (called reverseButton). Also, if you click the same thumbnail again, sometimes (only sometimes?) the slider closes, which is wrong and not in the script.
    The script for once the "(r)Evolve" viewer is open looks like this: (and the thumbnail buttons are named revolve1Button to revolve5Button from top to bottom):
    stop();
    function reverserevolve5 (event:MouseEvent):void {
         gotoAndPlay(1221);
    }
    reverseButton.addEventListener(MouseEvent.CLICK, reverserevolve5);
    function gotorevolve2fromrevolve5 (event:MouseEvent):void {
         gotoAndPlay(923);
    }
    revolve2Button.addEventListener(MouseEvent.CLICK, gotorevolve2fromrevolve5);
    function gotorevolve3fromrevolve5 (event:MouseEvent):void {
         gotoAndPlay(1022);
    }
    revolve3Button.addEventListener(MouseEvent.CLICK, gotorevolve3fromrevolve5);
    function gotorevolve4fromrevolve5 (event:MouseEvent):void {
         gotoAndPlay(1121);
    }
    revolve4Button.addEventListener(MouseEvent.CLICK, gotorevolve4fromrevolve5);
    function gotorevolve1fromrevolve5 (event:MouseEvent):void {
         gotoAndPlay(824);
    }
    revolve5Button.addEventListener(MouseEvent.CLICK, gotorevolve1fromrevolve5);
    function GoToOptionsfromrevolve5(event:MouseEvent):void {
        gotoAndPlay(2);
    }
    optionsButton.addEventListener(MouseEvent.CLICK, GoToOptionsfromrevolve5);
    Is this just too much script and too many frames? It seems easy enough to execute, but I'm new to this. Anyone know what might be causing this unpredictable behavior? Thanks a lot,
    Jono

    While this may not address each condition of an error, one thing you said may be fixable by using gotoAndStop() rather than gotoAndPlay() in those event handler functions.  If you are stopped at a particular frame and execute a command to gotoAndPlay that same frame, the stop that might have kept it there is already used up, so it plays.
    I wish I could understand more about what controls the sliding out bit.  That's something that should be trackable as to what tells it to retract when it isn't supposed to.  If it is a timeline tween that makes it appear and go away, then something is probably telling it to play() when it shouldn't.

  • Export command syntax for OA page in R12

    Hi,
    Can anyone let me know the syntax of the export command for an OA page in R12?
    I have tried the 11i export command, but could not get the page.
    Thanks,
    Divya

    Hi,
    sorry, wrong forum (this one is about JDeveloper and ADF) - see: OA Framework
    Frank

  • Export Command To Create FDF, In Adobe Pro 8

    I can't seem to find the export command to create an FDF in Adobe Pro 8. It works in Adobe Pro 6 when using the 'Execute a menu item' command.
    I can import fdf information within a JavaScript using the following code:
    this.importAnFDF("../Setup/Forms Setup Information.fdf");
    Is there a JavaScript that I can use to export fdf information similar to the code in the above line?

    Use the method exportAsFDF. Read the documentation in the JavaScript for Acrobat API Reference.

  • Export Web Application [EXPORT] Command in WAD

    HI Gurus,
    Here is a strange problem. I would like a real solution because I've read just about any documentation there is out there. So, thanks for thoughtful answers in advance.
    We created a custom template ZZANALYSIS_PATTERN off of SAP delivered standard 0ANALYSIS_PATTERN. We have also created a copy of 0ANALYSIS_PATTERN_EXPORT to ZZANALYSIS_PATTERN_EXPORT template in order to modify the look and feel of how reports look once exported to MS Excel.
    Here is the problem. When specifying parameters for Export command, we put ZZANALYSIS_PATTERN_EXPORT in the Internal Display -> Web Template section which forces it to go to that new custom template once user clicks on Export to Excel button vs going through a standard 0ANALYSIS_PATTERN_EXPORT template. The problem is that once that setting is specified when we test an extra pop-up window shows up when clickin on Export to Excel button - http://server/irj/servlet/prt/portal/prtroot/com.sap.ip.bi.web.portal.integration.launcher. That window is confusing to the users and it never closes, just blank window stays. Now, if we go back to Export command parameters in WAD and reset/leave the Default parameter setting for Internal Display section, no extra pop-up is showed and the user just gets open Excel file little window, which is what we need. So, that makes me wonder if there is some kind of backend config table where this Default setting could be changed from 0ANALYSIS_PATTERN_EXPORT to ZZANALYSIS_PATTERN_EXPORT, or maybe it is something in the BEx Web Analyzer iView in the portal.
    Thanks for your help.
    Andrei

    Hi Marcio,
    I did this a long time ago. Yes, there is no specific web item available to export as a .txt or tab-delimited file, but it is possible to achieve by hard-coding the HTML script in WAD.

  • Same command works on the server but not on the workstation

    Scenario
    Exchange 2010 SP3
    Server:
    Name                           Value
    CLRVersion                     2.0.50727.4984
    BuildVersion                   6.1.7600.16385
    PSVersion                      2.0
    WSManStackVersion              2.0
    PSCompatibleVersions           {1.0, 2.0}
    SerializationVersion           1.1.0.1
    PSRemotingProtocolVersion      2.1
    Workstation:
    Name                           Value
    PSVersion                      4.0
    WSManStackVersion              3.0
    SerializationVersion           1.1.0.1
    CLRVersion                     4.0.30319.18063
    BuildVersion                   6.3.9600.16406
    PSCompatibleVersions           {1.0, 2.0, 3.0, 4.0}
    PSRemotingProtocolVersion      2.2
    I input a command from my workstation using ISE
    get-exchangeserver | Where-Object {$_.IshubtransportServer -eq $True} | Get-MessageTrackingLog -Start (get-date).AddHours(-2) -Sender '[email protected]'
    Error:
    The input object cannot be bound to any parameters for the command either because the command does not take pipeline input or the input and its properties do not match any of the 
    parameters that take pipeline input.
    The Exact same command from the Server works.
    In my profile I have the following:
    set-Location c:\
    $Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri 'http://MyExchangeServer/powershell'
    import-pssession $Session
    Import-Module ActiveDirectory
    $host.PrivateData.ErrorForegroundColor = 'green'
    Clear-Host
    From My Workstation:
    PS C:\> Get-PSSession
     Id Name            ComputerName    State         ConfigurationName     Availability
      1 Session1        MyExchangeServer  Opened        Microsoft.Exchange       Available
    Why does the command work on the server but not on the workstation?
    Alexis

    Hi Tiri2014,
    There are some differences between Remote PowerShell and Exchange Management Shell. You cannot use the pipeline when you run some cmdlets by using Remote PowerShell.
    Here's a link to a similar case for your reference:
    Error message when you try to pipe the result of a cmdlet into another cmdlet by using Remote Powershell in Office 365 dedicated: “The input object cannot be bound to any parameters
    for the command”
    http://support.microsoft.com/kb/2701827/en-us
    Hope it helps
    Best regards

Maybe you are looking for

  • Data load from DSO to cube fails

    Hi Gurus, The data loads have failed last night and when I try to dig inside the process chains I find out that the cube which should be loaded from DSO is not getting loaded. The DSO has been loaded without errors from ECC. Error Message say "Th

  • Error in executing RFC or CAF Application Service

    Hello Everybody, I am creating a course approval process, in which I am calling an RFC for user info and, after course approval, calling a CAF application service to persist this data. But both services are not working in the process. I have tested the callable obj

  • Will project play in I.E. 7,8, or 9?

    I am creating a project for a client and I noticed that Captivate 6 chooses my Firefox browser to display the project.  I have both Firefox and Internet Explorer on my computer.  My client has only I.E. Will it show in I.E.?

  • Source List exit

    Hi, I am using tcode ME01 for updating the Contract/Item No in the source list. I have to put some checks in place before saving the source list, but I am not able to find the place where I can put the check. Please suggest the required user exit/BAdI if someone has used one.

  • Selected printer connection method is incorrect

    Whenever I try to print to our networked Canon ir5020 copier, I get a message "The selected printer connection method is incorrect. Select the correct printer connection method in Printer List, and then try to add the printer." I have tried it with P