Regular expressions with dates and multiple matches

I am currently attempting to automate modifying start and end dates within a .config file via PowerShell, but I am having trouble writing the regular expression for the end-date section, since both dates are on the same line in the file. Below is the string I want to change.
Sometimes the dates are blank and sometimes the dates are filled in.
Dates are always in the same format (yyyy-MM-dd hh:mm).
I also want to note that there are multiple instances of 'StartDate="" EndDate=""' for other applications throughout the same config file, so the expression has to include the App name.
I do not want to limit the search to a line number, since admins sometimes add an extra space to the config file, which would throw the line number off.
I want to replace the dates, or lack thereof, in their respective spots on the line below via PowerShell:
 <App name="TestApp" StartDate="2012-03-22 13:30" EndDate="">
I am successfully able to use 
$startRegex = '(?<=<App name="TestApp" StartDate=")([^"]*)'
to replace the StartDate, but I can't seem to single out the EndDate with a regular expression. What expression can I use to have it ignore what is inside the quotation marks after StartDate and only pay attention to the EndDate value?
Below is a snippet: 
$path = 'd:\inetpub\website\app.config'
$startRegex = '(?<=<App name="TestApp" StartDate=")([^"]*)'
$starttime = (get-date).ToString("yyyy-MM-dd hh:mm")
(gc $path) -replace $startregex, $starttime | set-content $path
I want to accomplish the same for EndDate.
Thanks in advance!
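One pattern that should work for the EndDate, sketched on the assumption that you keep the same -replace approach (the .NET regex engine behind -replace allows variable-length lookbehinds, so the lookbehind can skip whatever happens to be inside the StartDate quotes; $endRegex and $endtime are illustrative names):
$endRegex = '(?<=<App name="TestApp" StartDate="[^"]*" EndDate=")([^"]*)'
$endtime = (get-date).ToString("yyyy-MM-dd hh:mm")
(gc $path) -replace $endRegex, $endtime | set-content $path
Because [^"]* also matches an empty value, this handles both blank and filled-in dates.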

If you do this with XML it will be painless and less prone to error.
$filename = 'd:\inetpub\website\app.config'
[xml]$xml = Get-Content $filename
$n = $xml.SelectSingleNode('//App[@name="TestApp"]')
$n.StartDate = $newstartdate   # your new date strings, e.g. (Get-Date).ToString("yyyy-MM-dd hh:mm")
$n.EndDate = $newenddate
$xml.Save($filename)
\_(ツ)_/

Similar Messages

  • Regular Expression with commas and encapsulated characters

    Would appreciate some help. I am looking for a regular expression to remove commas from encapsulated text, as follows.
    For example
    - Input
    1,"This is a string, need to remove the comma",Another text string,10
    - Required output
    1,"This is a string; need to remove the comma",Another text string,10
    Have tried to use the REGEXP_REPLACE but could not grasp the pattern matching.
    Thanks John

    John Heaton wrote:
    Thanks for the solution; this works great for a single field encapsulated by " and containing a comma. I am parsing several different file definitions, so it would need to cascade through the string an undetermined number of times and replace all occurrences.
    Then try a MODEL solution (you will have to judge it performance-wise):
    {code}
    with t as (
           select '1,"This is a string, need to remove the comma",Another text string,10' txt from dual union all
           select '1,"remove this comma,",Another text string,10,"remove this comma,",xxx,"remove this comma,",11' txt from dual
          )
    select  txt_original,
            txt
      from  t
     model
       partition by(row_number() over(order by 1) p)
       dimension by(1 rn)
       measures(txt txt_original,txt,0 quote)
       rules iterate(1e9) until(iteration_number + 1 = length(txt[1]))
       (
        quote[1] = case substr(txt[1],iteration_number + 1,1)
                     when '"' then quote[1] + 1
                     else quote[1]
                   end,
        txt[1]   = case substr(txt[1],iteration_number + 1,1)
                     when ',' then case mod(quote[1],2)
                                     when 1 then substr(txt[1],1,iteration_number) || ';' || substr(txt[1],iteration_number + 2)
                                     else txt[1]
                                   end
                     else txt[1]
                   end
       )
    {code}
    TXT_ORIGINAL: 1,"This is a string, need to remove the comma",Another text string,10
    TXT:          1,"This is a string; need to remove the comma",Another text string,10
    TXT_ORIGINAL: 1,"remove this comma,",Another text string,10,"remove this comma,",xxx,"remove this comma,",11
    TXT:          1,"remove this comma;",Another text string,10,"remove this comma;",xxx,"remove this comma;",11
    SQL>
    SY.
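    For readers coming from the PowerShell question at the top of the page, the same quote-toggle idea (count quote characters as you scan, and swap a comma for a semicolon only while inside an open quote) can be sketched like this; $line is an illustrative variable holding one record:
    $line = '1,"This is a string, need to remove the comma",Another text string,10'
    $inQuotes = $false
    $out = foreach ($ch in $line.ToCharArray()) {
        if ($ch -eq '"') { $inQuotes = -not $inQuotes; $ch }
        elseif ($ch -eq ',' -and $inQuotes) { ';' }
        else { $ch }
    }
    -join $out   # 1,"This is a string; need to remove the comma",Another text string,10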
                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                           

  • Failed to acquire DAS with data sources that match the specified OCE

    Please help, Thank you all.
    We are migrating Oracle databases from Windows to Linux servers. Currently, I have no problem using my desktop Hyperion Intelligence Designer 8.5 with the new oce file to connect to Linux. However, when I published the same oce file to Hyperion Performance Suite 8.5, I got “Server Error [2018]: Failed to acquire Data Access Service with data sources that match the specified OCE”.
    I launched Service Configurator; in the “Local Service Configurator - DAS1_host”, the “Database JDBC URL” points to the Linux server. I also added an ODBC connection to the Oracle database on Linux in the “Properties of Data Access Service: DAS1_host’s Data Source”. The Database JDBC URLs in BI1_host, AZ1_host, PUB1_host, SM1_host, LS1_host, UT1_host, BPS1_host, AN1_host also point to the Linux server.
    The “Remote Service Configurator” has ES1_host’s “Storage” with the JDBC URL pointing to the Linux server, and the same for RM1_host’s and NS1_host’s “Storage”.
    Is there some missing configuration on the server? Thank you in advance.

    You should login to my oracle support to look into this ID.
    This is the solution mentioned
    Solution
    1) In the .profile file, you will find the definition of the LD_LIBRARY_PATH variable. However, the path to the 64-bit libraries is probably appended before the path to the 32-bit libraries (%ORACLE_HOME%/lib32). Therefore, please reverse the order in the definition so that the 32-bit libraries get loaded before the 64-bit libraries.
    Then, restart all services.
    2) Oracle 9.2 was the first Oracle release which defaults to using the 64-bit libraries in the 'lib' directory, and the 32-bit libraries that we need are in the 'lib32' directory.
    Therefore, please have %ORACLE_HOME%/lib32 placed in front of the LD_LIBRARY_PATH environment variable definition. Then, restart all services.
    3) Please check that users have read and write access to the files located in the %ORACLE_HOME%/lib32 folder.
    4) Check that the permissions of the libclntsh.a file (located in the %ORACLE_HOME%/lib32 folder) are set to rwxr-xr-x.
    5) Recreate the OCE connection in DAS and restart DAS in CMC.
    6) Restart all the BIPlus Services and process the query.

  • Help on export sybase iq tables with data and import in another database ?

    Help on export Sybase iq 16 tables with data and import into another database ?

    Hi Nilesh,
    If you have the table/index create commands (DDLs), you can create them in Developer and import the data using one of the methods below:
    Extract / Load table
    Insert location method: requires the IQ servers to be entered in the interfaces file
    Backup/Restore: copies the entire database content
    If you do not have the DDLs, you can generate them using IQ Cockpit or SCC.
    http://infocenter.sybase.com/help/topic/com.sybase.infocenter.dc01773.1604/doc/html/san1288042631955.html
    http://infocenter.sybase.com/help/topic/com.sybase.infocenter.dc01840.1604/doc/html/san1281564927196.html
    Regards,
    Tayeb.

  • Tolerance Limits with Date and quantity

    Hi Experts,
    We would like to set the GR tolerance limits for date and quantity.
    How do we set the tolerance limits for date and quantity?
    For example, if we create a PO for 100 qty with a delivery date of 01-05-2009,
    and we receive the goods on 20-04-2009 with 50 qty,
    MIGO should give an error message for the GR.
    Where can I set this tolerance limit configuration? Can you please explain?
    Thanks in advance,
    Anthyodaya.

    Go to SPRO - Material Management - LIV - Invoice Block - Set Tolerance Limits.
    Use the company code and maintain a value for ST.
    ST: Date variance (value x days)
    The system calculates, for each item, the product of amount * (scheduled delivery date - date invoice entered) and compares this product with the absolute upper limit defined. This allows relatively high schedule variances for invoice items with small amounts, but only small schedule variances for invoice items with large amounts.
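    A quick worked example with made-up numbers: an invoice item of amount 1,000 entered 10 days before the scheduled delivery date gives 1,000 * 10 = 10,000, and the item is blocked only if 10,000 exceeds the absolute upper limit maintained for ST.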

  • Tp ended with error code 0247 - addtobuffer has problems with data- and/or

    Hello Experts,
    If you give some idea, it will be greatly appreciated.
    This transport issue started after a power outage caused a hard shutdown of the SAP system.
    Then we brought the system back up. Before that, we did not have this transport issue.
    Our TMS landscape is:
    DEV - QA - PRD
    SED - SEQ - SEP
    DEV holds the TMS domain controller.
    FYI:
    At the OS level, when we run the scp command as the root user, it works fine for any TR.
    In STMS, while adding the TR in SEQ (the QA system), we are getting an error like this:
    Error:
    Transport control program tp ended with error code 0247
         Message no. XT200
    Diagnosis
         An error occurred when executing a tp command.
           Command:        ADDTOBUFFER SEDK906339 SEQ client010 pf=/us
           Return code:    0247
           Error text:     addtobuffer has problems with data- and/or
           Request:        SEDK906339
    System Response
         The function terminates.
    Procedure
         Correct the error and execute the command again if necessary.
    This is tp version 372.04.71 (release 700, unicode enabled)
    Addtobuffer failed for SEDK906339.
      Neither datafile nor cofile exist (cofile may also be corrupted).
    standard output from tp and from tools called by tp:
    tp returncode summary:
    TOOLS: Highest return code of single steps was: 0
    ERRORS: Highest tp internal error was: 0247

    When we do the scp using SM69:
    SEDADM@DEVSYS:/usr/sap/trans/cofiles/K906339.SED SEQADM@QASYS:/usr/sap/trans/cofiles/.
    it throws an error like the one below:
    Host key verification failed.
    External program terminated with exit code 1
    Thanks
    Praba

  • How to forward sms messages via email with date and contact received info

    Does anyone know how to forward SMS messages via email with the date and contact-received info?
    Currently, when I forward, only the body copy of the SMS message is sent in the email.

    This is not currently possible. Sorry.

  • Using expdp to export a mix of tables with data and tables without data

    Hi,
    I would like to create a .dmp file using expdp, exporting one set of tables with data and another set without data. Is there a way to do this in a single .dmp file? For example, I want all the tables in a schema with data, but for the fact tables in that schema, I only want the fact table objects, not the data. I thought it might be easier to create two separate .dmp files, one for each scenario, but it would be nice to have one .dmp file that satisfies my requirement. Any help is appreciated.
    Thanks,
    -Rodolfo
    Edited by: user6902559 on May 11, 2010 12:05 PM

    You could do this with where clauses. Let's say you have 10 tables to export, 5 with data and 5 without data. I would do it like this
    tab1_w_data
    tab2_w_data
    tab3_w_data
    tab4_w_data
    tab5_w_data
    tab1_wo_data
    tab2_wo_data
    tab3_wo_data
    tab4_wo_data
    tab5_wo_data
    I would make one generic query
    query="where rownum = 0"
    and I would make 5 specific queries
    query=tab1_w_data:"where rownum > 0"
    query=tab2_w_data:"where rownum > 0"
    query=tab3_w_data:"where rownum > 0"
    query=tab4_w_data:"where rownum > 0"
    query=tab5_w_data:"where rownum > 0"
    The first query will be applied to all tables that don't have their own specific query, and it will export no rows; the next five will each apply to the corresponding table.
    Dean
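    A sketch of how those pieces might be combined in a Data Pump parameter file; the directory, dump file name, and exact table list are placeholders, not from the original post:
    DIRECTORY=dump_dir
    DUMPFILE=mixed_export.dmp
    TABLES=tab1_w_data,tab2_w_data,tab3_w_data,tab4_w_data,tab5_w_data,tab1_wo_data,tab2_wo_data,tab3_wo_data,tab4_wo_data,tab5_wo_data
    QUERY="WHERE rownum = 0"
    QUERY=tab1_w_data:"WHERE rownum > 0"
    QUERY=tab2_w_data:"WHERE rownum > 0"
    QUERY=tab3_w_data:"WHERE rownum > 0"
    QUERY=tab4_w_data:"WHERE rownum > 0"
    QUERY=tab5_w_data:"WHERE rownum > 0"
    Running it with something like "expdp <user>/<password> parfile=mixed_export.par" should then export the *_w_data tables with rows and the *_wo_data tables as empty definitions, per the catch-all QUERY described above.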

  • Set up Tolerance Limits with Date and quantity for GR

    Hi Experts,
    We would like to set the GR tolerance limits for date and quantity.
    How do we set the tolerance limits for date and quantity?
    For example, if we create a PO for 100 qty with a delivery date of 01-05-2009,
    and we receive the goods on 20-04-2009 with 50 qty,
    MIGO should give an error message for the GR.
    Where can I set this tolerance limit configuration? Can you please explain?
    Thanks in advance,
    Anthyodaya.

    Hi,
    You can do one thing:
    activate the SLED check.
    With this, when the date is earlier or later than the expected date, the system shows messages like:
    1. Earliest delivery date is DD.MM.YY.
    2. Expected delivery date was DD.MM.YY.
    In the standard SAP system, the second message is already an error message, so for a late delivery it will not be possible to post the GR.
    For the quantity variance, activate the quantity check either via the over/under-delivery tolerances or via SPRO settings:
    SPRO >> MM >> LIV >> Invoice Block >> check for quantity tolerances, and make the settings as per your requirement.
    Regards
    Priyanka.P

  • Export with DATA and DDL

    Hi,
    In version 8i, I have to export a table with data and DDL. I wonder if the following command line is enough, or should I add some other option?
    exp sys/***@DB TABLES=MYTABLE file=d:\MYTABLE.dmp
    Many thanks.

    > or I should add some other option ?
    > exp sys/***@DB TABLES=MYTABLE file=d:\MYTABLE.dmp
    Yup, this is enough; it will bring the DDL for the indexes on table MYTABLE as well.
    Virag

  • HR -- adding year with date and month

    Hi all,
    Can anyone let me know what the logic is for adding years to a date (day and month) in HR?
    Example: from 01.10.2008, the result should be 2 years later.
    Thanks,
    Joshi.

    Hi
    You can add days to the date directly.
    If you have a date V_DATE of type sy-datum, you can add a number of days directly to get the desired date:
    v_date1 = v_date + 365.
    Hope it helps.

  • I have a question. I don't know what happened, but my safari went from regular safari with google and now everything is yahoo. maybe it was firefox or something. I just want the regular safari back. Can someone help me?

    Frustrated!!!!!!,
    I have a question. I don't know what happened, but my safari went from regular safari with google and now everything is yahoo. maybe it was firefox or something. I just want the regular safari back. Can someone help me?

    Check the links below for options to remove the Adware.
    The Easy, safe, effective method:
    http://www.adwaremedic.com/index.php
    If you are comfortable doing manual file removals use the somewhat more difficult method:
    http://support.apple.com/en-us/HT203987
    Also read the articles below to be more prepared for the next time there is an issue on your computer.
    https://discussions.apple.com/docs/DOC-7471
    https://discussions.apple.com/docs/DOC-8071

  • Why doesn't Apple provide .jpg files for photos with date and hour like BlackBerry and Samsung?

    Why doesn't Apple provide .jpg files for photos with the date and hour, like BlackBerry and Samsung do?
    I have to save photos (for medical use) on my PC, and this suffix would be useful to identify them more quickly.
    Thanks

    http://www.learn.usa.canon.com/resources/blogs/2014/20140708_winston_filenames_blog.shtml

  • Regular expressions with boolean connectives (AND, OR, NOT) in Java?

    I'd like to use regular expression patterns that are made up of simple regex patterns connected via AND, OR, or NOT operators, in order to do some keyword-style pattern matching.
    A pattern could look like this:
    (.*Is there.*) && (.*library.*) && !((.*badword.*) || (^$))
    Is there any Java regex library that allows these operators?
    I know that in principle these operators should be available, since regular languages are closed under union, intersection, and complement.

    > AND is implicit, xy -- means x AND y
    That's not what I need, though, since this is just concatenation of regexes. Thus, /xy/ would not match the string "a y a x", because y precedes x.
    > So it has to contain both x and y, but they could be in any order? You can't do that easily or generally. "x.*y|y.*x" would work here, but obviously it will get ugly factorially fast as you add more terms.
    You got that right: AND means the regex operands can appear in any order. That's why I'm looking for some regex library that does all this ugly work for me. Again, from a theoretical point of view, it IS possible to express the described semantics of AND with regular expressions, although they will get rather obfuscated.
    Unless somebody has done something similar in Java (e.g., for C++, there's Ragel: http://www.cs.queensu.ca/~thurston/ragel/), I will probably use some finite-state-machine libraries and compile the complex regexes into automata (which can be minimized using well-defined operations on FSMs).
    > You'd probably just be better off doing multiple calls to matches() or whatever.
    Yes, that's another possibility: do the boolean operators in Java itself.
    > Of course, if you really are just looking for literals, then you can just use str.contains(a) && !str.contains(b) && (str.contains(c) || str.contains(d)). You don't seem to need regex--at least not from your example.
    OK, bad example, I do have "real" regexps in there :)

  • Email Regular Expression with a String.Match()

    I'm currently using a RichTextEditor for a user to build HTML
    for a site. However, I want the application to scan for emails and
    encode them so they are protected from spam bots when they go to
    the live site. I've written a regular expression to find an email
    and it seems to work, but it only returns one email at a time from
    the string. I have had to revert to a while loop to traverse the
    string until I'm satisfied. I don't particularly like that method
    and would like to just do one String.match() query to retrieve all
    of the emails. Can anyone see something here that I'm missing?

    Try adding the global flag (g):
    var emailPattern:RegExp =
    /[a-z][\w.-]+@\w[\w.-]+\.[\w.-]*[a-z][a-z]+/g;
    TS
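    For anyone doing the same thing from PowerShell/.NET (the language of the main question above), the all-matches equivalent of a global match is the static Matches method rather than a single -match; $html is an illustrative variable holding the editor's text:
    $emailPattern = '[a-z][\w.-]+@\w[\w.-]+\.[\w.-]*[a-z][a-z]+'
    [regex]::Matches($html, $emailPattern) | ForEach-Object { $_.Value }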
