I have to generate a 4 char unique string from a long value

I got a requirement:
I have to generate a 4 char unique string from a long value.
Each char can be any of the 32 characters defined as below.
private static final char char_map[] = new char[]{'7','2','6','9','5','3','4','8','X','M','G','D','A','E','B','F','C','Q','J','Y','H','U','W','V','S','K','R','L','N','P','Z','T'};
So for a 4 char string the possible combinations are 32 * 32 * 32 * 32 = 1048576.
If anyone passes a long value between 0 and 1048575, it should generate a unique 4 char string.
Anyone with an idea would be a great help.

Well, a long is 64 bits and a Java char is 16 bits, but since each of your output characters is one of only 32 symbols it really carries 5 bits, so 4 characters encode exactly 20 bits (values 0 to 1048575). Once you determine how you want to map the long's bits onto those 5-bit groups, go google for "java bitwise operators".
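For reference, a minimal sketch of one such mapping, using the char_map from the question (the class name, method name, and range check are just illustrative): pull the value apart five bits at a time, from the most significant group to the least, and look each group up in the table.

public class ShortCode {
    private static final char[] CHAR_MAP = {
        '7','2','6','9','5','3','4','8','X','M','G','D','A','E','B','F',
        'C','Q','J','Y','H','U','W','V','S','K','R','L','N','P','Z','T'
    };

    // Encode a value in the range 0 .. 1048575 (2^20 - 1) as 4 characters.
    public static String encode(long value) {
        if (value < 0 || value > 0xFFFFF) {
            throw new IllegalArgumentException("value out of range: " + value);
        }
        char[] out = new char[4];
        for (int i = 3; i >= 0; i--) {
            out[i] = CHAR_MAP[(int) (value & 0x1F)]; // low 5 bits pick one of 32 chars
            value >>>= 5;                            // shift to the next 5-bit group
        }
        return new String(out);
    }
}

Because every value in that range maps to a distinct 20-bit pattern, distinct inputs always give distinct 4-character strings.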

Similar Messages

  • Generate fixed-length unique strings

    Hi all,
    I'm trying to generate a 16 byte unique string from two input strings. Essentially, the unique string will be used as a primary key in the database. The two input strings are (siteUrl, productId), in which siteUrl is the URL of a website which has one or more productIds. Each productId in a site is unique but there might be duplicate productIds from different sites. I want to generate 16 byte ids from each pair of (siteUrl, productId) such that they are unique (or have a very small chance of collision). Has anyone done this before? Please share your experience! Thanks a heap!

    Kaj, thanks for the answer. However, what I want to know is how to convert, say, productId to a unique 8 byte string. Any idea?
    > What does the product id look like?
    Product id is usually a string of digits and letters:
    2323, 234lasfd1kj3, ...
    What I'm looking for is a hash function h such that:
    h(siteUrl) -> 8 byte string
    h(productId) -> 8 byte string
    I can then combine h(siteUrl) and h(productId) to get a 16 byte unique string.
    > Sorry but you can't! Since the hashes will not be unique, the combination will not be unique.
    There should be such a hash function somewhere but I haven't found it. The definition is here: http://www.x5.net/faqs/crypto/q94.html
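    (For what it's worth, a common practical compromise is a single 16 byte digest of the combined pair rather than two 8 byte hashes; it is not guaranteed unique, but the collision probability is vanishingly small. A minimal sketch, with illustrative names:)

    import java.nio.charset.StandardCharsets;
    import java.security.MessageDigest;
    import java.security.NoSuchAlgorithmException;

    public class KeyDigest {
        // 16-byte MD5 digest of siteUrl + NUL separator + productId.
        static byte[] key16(String siteUrl, String productId) throws NoSuchAlgorithmException {
            MessageDigest md = MessageDigest.getInstance("MD5");
            return md.digest((siteUrl + '\u0000' + productId).getBytes(StandardCharsets.UTF_8));
        }
    }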

  • Idoc( i have to generate two idoc

    hi experts,
    I have to generate two IDocs at a time from outbound processing. How is it possible? The data I am sending should be the same in both. Please help me.
    Thank you
    with regards,
    rohit

    Hi Rohit,
    It is not possible to generate two outbound IDocs at a time.
    But you can resend/reprocess using the standard program RSNAST00 for sales order/invoice/delivery IDocs, for example.
    What is the message type/IDoc type that you are dealing with?
    Regards,
    Ferry Lianto

  • Reporting Services R2 log entries have stopped generating on report execution.

    Hi -
    I have a new instance of a reporting server and the log files have quit generating on report execution. 
    When the services restart this is the log file:
    rshost!rshost!8e8!01/22/2014-17:08:33:: i INFO: CLR runtime is initialized.
    rshost!rshost!8e8!01/22/2014-17:08:33:: i INFO: Derived memory configuration based on physical memory as 3931704 KB
    rshost!rshost!8e8!01/22/2014-17:08:33:: w WARN: Failed to initialize performance counters: ffffffff. Possible setup issue.
    rshost!rshost!8e8!01/22/2014-17:08:33:: e ERROR: Failed to start service, error=1063.
    We have logging turned on on the server.  Report Server Mode is Native.  Don't see anything in the application log about a failed service.
    Any ideas?

    Hi dwrco,
    Have you recently installed any update such as MS12-070 (KB2716440)? If so, this is a known issue with some patches for SQL Server. The reason is as follows:
    When a .NET application starts up, the .NET Framework tries to verify that the Microsoft assemblies have not been modified; this is done by connecting to an internet server that hosts a certificate revocation list. The framework connects
    to that server with a long timeout, so any other .NET application with a shorter timeout setting (such as SSIS or SSRS) fails with a timeout error.
    For example, the error message "The request failed or the service did not respond in a timely fashion." means that the SSRS timeout is lower than the timeout of the connection to the revocation list server established by the .NET Framework.
    To work around the issue, we usually use two approaches:
    Increase the value of ServicesPipeTimeout registry key of Reporting Services
    Disable generatePublisherEvidence element of ReportingServicesService.exe.config file
    For the detailed steps, please see:
    http://blogs.msdn.com/b/mariae/archive/2012/11/12/reporting-services-service-doesn-t-start-after-the-installation-of-ms12-070-security-patch.aspx 
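    (A rough sketch of the second workaround, assuming the element goes under the runtime section of ReportingServicesService.exe.config; the surrounding file will already contain other settings:)

    <configuration>
      <runtime>
        <generatePublisherEvidence enabled="false"/>
      </runtime>
    </configuration>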
    Regards,
    Mike Yin
    TechNet Community Support

  • "No sensor values have been generated for this instance." BPELConsole msg.

    I've reworked the GoogleFlowWithSensors demo to use BAM sensors as a prelude to incorporating BAM into a real BPEL process we have that's much more complicated. I've moved the schema in the GoogleFlowWithSensors.wsdl file to an imported XSD file as directed in the literature. I created in BAM Architect a DataObject called ProcessExecutionTimestamp with 5 columns: instanceId, receiveInput_TS, invokePartnerLink_TS, receivePartnerLink_TS, and callbackClient_TS. I was able to successfully tie the BAM sensors to this DataObject.
    Every time I test the BPEL process in BPELConsole, I get the results back, but nothing except a "No sensor values have been generated for this instance." message in the BPELConsole "sensors" pane. I have found nothing in any BAM or BPEL log that indicates what could be going on.
    Thanks in advance.

    Hi,
    BPEL Console does not show data published to BAM sensors; to see sensor values in the BPEL Console you have to use the database sensor type.
    The BPEL log (default.log) shows any errors encountered when sending data to BAM.
    Thanks

  • Receiving msgs that mail I HAVE NOT GENERATED cannot be delivered.  About 15 each day.  Is this a virus?

    Hi, I have been receiving about 15 email messages a day from "Mail Administrator - Mail System Error Returned Mail" saying that emails I have not generated cannot be delivered because the destination computer cannot be found!  These emails are in a foreign language, which appears to be Russian, and I have no clue who/what generates them.  It was suggested to me that it could be a virus.  I have done three scans and so far nothing malicious has been found.  Does anyone know how I can fix/stop these?
    Thank you
    Chrysi

    If this is someone else's windows machine, infected, and with your contact information in their contacts, then there is little if anything you can do.  Windows machines can often become infected by mail bots that randomly send mail from the infected computer to people in the contact's database, AND using email addresses found in those contacts to spoof the "from" field in the outgoing mail.  When those messages fail, the receiving server bounces them back to you as your email was in the "from" address field.
    The only solution would be to disinfect the infected computer, but since you have no idea who (amongst all those with your contact information) is the actual infected computer, you cannot really do anything.

  • Dispatch texts have sequential address - how to set unique ringtone?

    Good evening all,
    I'm a volunteer firefighter and our department utilizes a text-based dispatch system as one of the ways of notifying us.  The texts have a sequential number as part of the FROM address... i.e. [email protected]  One message may end with ###138 and the next may be ###165.  Obviously, this makes it difficult to set a contact and assign a unique ringtone as the address constantly changes.
    The question I have... how can I set a unique ringtone to a constantly changing address?????
    Thanks!
    Pete

    Yes. It's come up a few times.  The iPhone uses the caller ID string to associate an incoming call or message with a custom ring/text tone.  Unfortunately, it cannot base that on a partial number match. It must match the entire number.
    The only real solution would be to assign a custom tone to all of your contacts and a separate tone as the default. That causes its own issue as anything coming from anyone other than your contacts will use the default.

  • Insert into a table with unique columns from another table.

    There are two tables,
    STG_DATA
    ORDER_NO    DIR_CUST_IND
    1002        DNA
    1005        GEN
    1005        NULL
    1008        NULL
    1001        NULL
    1001        NULL
    1006        NULL
    1000        ZZZ
    1001        ZZZ
    FACT_DATA
    ORDER_NO    DIR_CUST_IND
    1005        NULL
    1006        NULL
    1008        NULL
    I need to insert only unique [ORDER_NO] values from STG_DATA into FACT_DATA with the corresponding [DIR_CUST_IND]. Though STG_DATA has multiple rows with the same ORDER_NO, I need to insert only one of them; it can be any of those records.
    Sarvan

    CREATE TABLE #Level(ORDER_NO INT, DIR_CUST_IND CHAR(3))
    INSERT #Level
    SELECT 1002,'DNA' UNION
    SELECT 1005,'GEN' UNION
    SELECT 1005,NULL UNION
    SELECT 1008,NULL UNION
    SELECT 1001,NULL UNION
    SELECT 1001,NULL UNION
    SELECT 1006,NULL UNION
    SELECT 1000,'ZZZ' UNION
    SELECT 1001,'ZZZ'
    SELECT ORDER_NO, DIR_CUST_IND
    FROM ( SELECT ROW_NUMBER() OVER (PARTITION BY ORDER_NO ORDER BY ORDER_NO) RowNum, *
           FROM #Level ) A
    WHERE RowNum = 1
    I hope this gives you enough of an idea. All you have to do is write the insert statement.
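    For example, a sketch of that insert (assuming FACT_DATA has the same two columns and that ORDER_NOs already present there should be skipped):

    INSERT INTO FACT_DATA (ORDER_NO, DIR_CUST_IND)
    SELECT ORDER_NO, DIR_CUST_IND
    FROM ( SELECT ROW_NUMBER() OVER (PARTITION BY ORDER_NO ORDER BY ORDER_NO) RowNum, *
           FROM STG_DATA ) A
    WHERE RowNum = 1
      AND NOT EXISTS (SELECT 1 FROM FACT_DATA f WHERE f.ORDER_NO = A.ORDER_NO)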
    Next time please post DDL & DML.

  • Unable to generate XML's for BLOB datatypes from Concurrent program

    Hi All,
    I have a requirement to print images on an RTF layout. Images are uploaded by the end user through the attachments menu
    and they are getting stored in the fnd_lobs table.
    For printing BLOB images we need to convert them into CLOB and generate XML.
    I've done the conversion through a function, and I call the function in the select query, which generates XML when I run it from Toad.
    SELECT xmlgen.getXml(
    'SELECT file_id,mob_getbase64String(file_data) photo
    FROM fnd_lobs
    WHERE file_id = 2490481'
    ,0
    ) FROM dual;
    But when I registered the same thing as a concurrent program (SQL*Plus), the program runs into an error.
    Output file
    The XML page cannot be displayed
    Cannot view XML input using style sheet. Please correct the error and then click the Refresh button, or try again later.
    Invalid at the top level of the document. Error processing resource 'https://dbtdev5i.oracleoutsourcing.com/OA_CGI/FNDWRR.e...
    Input truncated to 17 characters
    ^
    Log file
    Concurrent Manager encountered an error while running SQL*Plus for your concurrent request 10868311.
    Review your concurrent request log and/or report output file for more detailed information.
    Can anyone help me with how to get past this error and generate the XML?
    Thanks in Advance
    Jana

    Hi Priya,
    I have changed the query and registered it in Apps, and now I am able to generate XML for the BLOB image, and the image is getting printed on the template.
    DECLARE
      v_colb  CLOB;
      v_query VARCHAR2(1000);
    BEGIN
      v_query := 'SELECT file_id, mob_getbase64String(file_data) photo
                  FROM fnd_lobs fl,
                       fnd_documents_vl fd
                  WHERE fd.media_id = fl.file_id
                  AND (fd.end_date_active IS NULL
                       OR fd.end_date_active > SYSDATE)
                  AND fd.security_type = 2
                  AND fd.security_id = fnd_profile.value(''GL_SET_OF_BKS_ID'')';
      --FND_FILE.put_line(FND_FILE.LOG, v_query);
      v_colb := xmlgen.getxml(v_query, 0);
      --DBMS_OUTPUT.put_line(v_query);
      FND_FILE.put_line(FND_FILE.OUTPUT, v_colb);
    END;
    /

  • Routine - get char 2 to 5 from the source value

    Hello friends,
    I have a requirement like below:
    I have a char field whose value is A1234, and the target data type is numeric(5).
    Now I need to load the data to the DSO with the value 1234, since the target data type is numeric and I have to ignore the 1st character, i.e. 'A'.
    So I have to write a start routine in the transformation to get characters 2 to 5 of the source field. Can anyone provide me the routine?
    Thanks,
    SR

    Hi Stefan,
    Thanks for the quick response.
    I wrote the same in the transformation for that object, but it is throwing the error: 'COMM_STRUCTURE-/bic/xxxx' is unknown. It is neither in one of the specified tables nor defined by a 'DATA' statement.
    I think the way we write routines in BI 7 is different?
    Thanks
    SR
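    (For what it's worth, in BI 7.x transformations the source structure is called SOURCE_FIELDS rather than COMM_STRUCTURE, so a field routine along these lines should work; a sketch only, where /bic/xxxx stands for your actual source field:)

    " Keep characters 2 to 5 of the source field (offset 1, length 4).
    RESULT = SOURCE_FIELDS-/bic/xxxx+1(4).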

  • Generate reports in PDF or EXCEL from Web App...

    Hello, I have to generate some reports from a database using a Java web app. These reports, driven by a few generation criteria, must be produced in both PDF and Excel format. Does anyone know a tool with which I can generate them?
    Best regards.

    PDF: iText
    Excel: Apache POI
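    For a quick start, rough sketches with those two libraries (assuming iText 5.x and Apache POI 3.10+ on the classpath; class and file names are illustrative):

    // PdfSketch.java - one-paragraph PDF with iText 5.x
    import com.itextpdf.text.Document;
    import com.itextpdf.text.Paragraph;
    import com.itextpdf.text.pdf.PdfWriter;
    import java.io.FileOutputStream;

    public class PdfSketch {
        public static void main(String[] args) throws Exception {
            Document doc = new Document();
            PdfWriter.getInstance(doc, new FileOutputStream("report.pdf"));
            doc.open();
            doc.add(new Paragraph("Report generated from the web app"));
            doc.close();
        }
    }

    // ExcelSketch.java - one-cell workbook with Apache POI (XSSF = .xlsx)
    import org.apache.poi.xssf.usermodel.XSSFWorkbook;
    import java.io.FileOutputStream;

    public class ExcelSketch {
        public static void main(String[] args) throws Exception {
            try (XSSFWorkbook wb = new XSSFWorkbook();
                 FileOutputStream out = new FileOutputStream("report.xlsx")) {
                wb.createSheet("Report").createRow(0).createCell(0)
                  .setCellValue("Report generated from the web app");
                wb.write(out);
            }
        }
    }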

  • Help needed to get unique record from an internal table

    Hi Everybody,
    I have to get a unique record from an internal table. I know we can use the READ statement with a key condition.
    But the problem is that I have to use relational operators like GE or LE.
    eg
    read table itab into wa with key width GE itab-widthfrom
                                                       width LE itab-widthto
                                                       machno eq itab-machno.
    It's giving me an error when I use the operators GE, LE (I think because it can't find a unique record with those relational
    operators in the WITH KEY condition).
    Is there any other way to get a unique record from an internal table without using a loop?
    Thanks,
    Sunny

    Using the READ statement you will need some kind of loop. For example (assuming the table has a flag field, here called READ, that marks the rows already processed):
    DO.
      READ TABLE itab INTO wa
        WITH KEY ......
                 read = space.
      IF sy-subrc EQ 0.
        wa-read = 'X'.
        MODIFY itab FROM wa INDEX sy-tabix.
        ADD 1 TO w_found.
      ELSE.
        EXIT.
      ENDIF.
    ENDDO.
    IF w_found EQ 1.
      " ...record is unique.
    ENDIF.
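    Alternatively, LOOP AT ... WHERE does accept relational operators such as GE and LE, so something along these lines avoids the READ restriction (a sketch; the lv_* bounds are placeholders for your own values):

    " Count the rows matching the range; exactly one hit means the record is unique.
    DATA: lv_hits TYPE i.
    LOOP AT itab INTO wa WHERE width  GE lv_widthfrom
                           AND width  LE lv_widthto
                           AND machno EQ lv_machno.
      lv_hits = lv_hits + 1.
    ENDLOOP.
    IF lv_hits EQ 1.
      " wa holds the single matching record.
    ENDIF.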

  • ASCII character/string processing and performance - char[] versus String?

    Hello everyone
    I am a relative novice to Java; I have a procedural C programming background.
    I am reading many very large (many GB) comma/double-quote separated ASCII CSV text files and performing various kinds of pre-processing on them, prior to loading into the database.
    I am using Java7 (the latest) and using NIO.2.
    The IO performance is fine.
    My question is regarding performance of using char[i] arrays versus Strings and StringBuilder classes using charAt() methods.
    I read a file one line/record at a time and then I process it. Regex is not an option (too slow, and it cannot handle all the cases I need to cover).
    I noticed that accessing a single character of a given String (or StringBuilder too) class using String.charAt(i) methods is several times (5 times+?) slower than referring to a char of an array with index.
    My question: is this correct observation re charAt() versus char[i] performance difference or am I doing something wrong in case of a String class?
    What is the best way (performance) to process character strings inside Java if I need to process them one character at a time ?
    Is there another approach that I should consider?
    Many thanks in advance

    >
    Once I took that String.length() method out of the 'for loop' and used integer length local variable, as you have in your code, the performance is very close between array of char and String charAt() approaches.
    >
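    (For reference, the two loop shapes being compared look roughly like this; a minimal sketch with illustrative names:)

    public class CharScanSketch {
        // length() hoisted out of the loop, one charAt() call per character
        static int countCommas(String line) {
            int hits = 0;
            int n = line.length();
            for (int i = 0; i < n; i++) {
                if (line.charAt(i) == ',') hits++;
            }
            return hits;
        }

        // one toCharArray() copy up front, then direct array indexing
        static int countCommasViaArray(String line) {
            int hits = 0;
            char[] a = line.toCharArray();
            for (int i = 0; i < a.length; i++) {
                if (a[i] == ',') hits++;
            }
            return hits;
        }
    }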
    You are still worrying about something that is irrelevant in the greater scheme of things.
    It doesn't matter how fast the CPU processing of the data is if it is faster than you can write the data to the sink. The process is:
    1. read data into memory
    2. manipulate that data
    3. write data to a sink (database, file, network)
    The reading and writing of the data are going to be tens of thousands of times slower than any CPU you will be using. That read/write part of the process is the limiting factor of your throughput; not the CPU manipulation of step #2.
    Step #2 can only go as fast as steps #1 and #3 permit.
    Like I said above:
    >
    The best 'file to database' performance you could hope to achieve would be loading simple, 'known to be clean', record of a file into ONE table column defined, perhaps, as VARCHAR2(1000); that is, with NO processing of the record at all to determine column boundaries.
    That performance would be the standard you would measure all others against and would typically be in the hundreds of thousands or millions of records per minute.
    What you would find is that you can perform one heck of a lot of processing on each record without slowing that 'read and load' process down at all.
    >
    Regardless of the sink (DB, file, network) when you are designing data transport services you need to identify the 'slowest' parts. Those are the 'weak links' in the data chain. Once you have identified and tuned those parts the performance of any other step merely needs to be 'slightly' better to avoid becoming a bottleneck.
    That CPU part for step #2 is only rarely, if ever, the problem. Don't even consider it for specialized tuning until you demonstrate that it is needed.
    Besides, if your code is properly designed and modularized you should be able to 'plug n play' different parse and transform components after the framework is complete and in the performance test stage.
    >
    The only thing that is fixed is that all input files are ASCII (not Unicode) characters in range of 'space' to '~' (decimal 32-126) or common control characters like CR,LF,etc.
    >
    Then you could use byte arrays and byte processing to determine the record boundaries even if you then use String processing for the rest of the manipulation.
    That is what my framework does. You define the character set of the file and a 'set' of allowable record delimiters as Strings in that character set. There can be multiple possible record delimiters and each one can be multi-character (e.g. you can use 'XyZ' if you want).
    The delimiter set is converted to byte arrays and the file is read using RandomAccessFile and double-buffering and a multiple mark/reset functionality. The buffers are then searched for one of the delimiter byte arrays and the location of the delimiter is saved. The resulting byte array is then saved as a 'physical record'.
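    (A sketch of that kind of search, not the author's framework: scan a filled byte buffer for the first occurrence of any of several multi-byte record delimiters.)

    public class DelimiterScanSketch {
        static int indexOfDelimiter(byte[] buf, int len, byte[][] delimiters) {
            for (int i = 0; i < len; i++) {
                for (byte[] d : delimiters) {
                    if (i + d.length > len) continue;      // delimiter would run past the data
                    int j = 0;
                    while (j < d.length && buf[i + j] == d[j]) j++;
                    if (j == d.length) return i;           // start of a record delimiter
                }
            }
            return -1;                                     // no delimiter in this buffer
        }
    }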
    Those 'physical records' are then processed to create 'logical records'. The distinction is due to possible embedded record delimiters as you mentioned. One logical record might appear as two physical records if a field has an embedded record delimiter. That is resolved easily since each logical record in the file MUST have the same number of fields.
    So a record with an embedded delimiter will have fewer fields than required, meaning it needs to be combined with one or more of the following records.
    >
    My files have no metadata, some are comma delimited and some comma and double quote delimited together, to protect the embedded commas inside columns.
    >
    I didn't mean the files themselves needed to contain metadata. I just meant that YOU need to know what metadata to use. For example you need to know that there should ultimately be 10 fields for each record. The file itself may have fewer physical fields due to TRAILING NULLCOLS, whereby all consecutive NULL fields at the end of a record do not need to be present.
    >
    The number of columns in a file is variable and each line in any one file can have a different number of columns. Ragged columns.
    There may be repeated null columns in any line, like ,,, or "","","" or any combination of the above.
    There may also be spaces between delimiters.
    The files may be UNIX/Linux terminated or Windows Server terminated (CR/LF or CR or LF).
    >
    All of those are basic requirements and none of them present any real issue or problem.
    >
    To make it even harder, there may be embedded LF characters inside the double quoted columns too, which need to be caught and weeded out.
    >
    That only makes it 'harder' in the sense that virtually NONE of the standard software available for processing delimited files takes that into account. There have been some attempts (you can find them on the net) at using various 'escaping' techniques to escape those characters where they occur, but none of them ever caught on and I have never found any in widespread use.
    The main reason for that is that the software used to create the files to begin with isn't written to ADD the escape characters but is written on the assumption that they won't be needed.
    That read/write for 'escaped' files has to be done in pairs. You need a writer that can write escapes and a matching reader to read them.
    Even the latest version of Informatica and DataStage cannot export a simple one column table that contains an embedded record delimiter and read it back properly. Those tools simply have NO functionality to let you even TRY to detect that embedded delimiters exist, let alone do anything about it by escaping those characters. I gave up back in the '90s trying to convince the Informatica folk to add that functionality to their tool. It would be simple to do.
    >
    Some numeric columns will also need processing to handle currency signs and numeric formats that are not valid for the database input.
    It does not feel like a job for RegEx (I want to be able to maintain the code and complex Regex is often 'write-only' code that a 9200bpm modem would be proud of!) and I don't think PL/SQL will be any faster or easier than Java for this sort of character based work.
    >
    Actually for 'validating' that a string of characters conforms (or not) to a particular format is an excellent application of regular expressions. Though, as you suggest, the actual parsing of a valid string to extract the data is not well-suited for RegEx. That is more appropriate for a custom format class that implements the proper business rules.
    You are correct that PL/SQL is NOT the language to use for such string parsing. However, Oracle does support Java stored procedures, so that could be done in the database. I would only recommend pursuing that approach if you already needed to perform some substantial data validation or processing in the DB to begin with.
    >
    I have no control over format of the incoming files, they are coming from all sorts of legacy systems, many from IBM mainframes or AS/400 series, for example. Others from Solaris and Windows.
    >
    Not a problem. You just need to know what the format is so you can parse it properly.
    >
    Some files will be small, some many GB in size.
    >
    Not really relevant except as it relates to the need to SINK the data at some point. The larger the amount of SOURCE data the sooner you need to SINK it to make room for the rest.
    Unfortunately, the very nature of delimited data with varying record lengths and possible embedded delimiters means that you can't really chunk the file to support parallel read operations effectively.
    You need to focus on designing the proper architecture to create a modular framework of readers, writers, parsers, formatters, etc. Your concern with details about String versus Array are way premature at best.
    My framework has been doing what you are proposing and has been in use for over 20 years by three different major international clients. I have never had any issues with the level of detail you have asked about in this thread.
    Throughput is limited by the performance of the SOURCE and the SINK. The processing in-between has NEVER been an issue.
    A modular framework allows you to fine-tune or even replace a component at any time with just 'plug n play'. That is what Interfaces are all about. Any code you write for a parser should be based on an interface contract. That allows you to write the initial code using the simplest possible method and then later, if, and ONLY if, that particular module becomes a bottleneck, replace that module with one that is more performant.
    Your initial code should ONLY use standard well-established constructs until there is a demonstrated need for something else. For your use case that means String processing, not byte arrays (except for detecting record boundaries).
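    (A sketch of the kind of interface contract described above, with illustrative names; the point is that the first simple implementation can later be swapped for a faster one without touching the rest of the framework.)

    import java.util.Arrays;
    import java.util.List;

    interface RecordParser {
        List<String> parse(String logicalRecord);   // split one logical record into its fields
    }

    class SimpleCommaParser implements RecordParser {
        public List<String> parse(String logicalRecord) {
            return Arrays.asList(logicalRecord.split(",", -1));  // -1 keeps trailing empty fields
        }
    }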

  • Finding unique object from a collection

    Hi,
    Is there a way to select all the unique objects from a pool of collections?
    i.e., I have a set which contains (1,1, 2,3,4,4,5) and want to get (1,2,3,4,5) back. Is there a class or function that is able to do that?
    thx in adv

    > i have a set which contains (1,1, 2,3,4,4,5)
    No you don't. Sets, by definition, don't contain duplicates.
    Nice, George. I missed the part where the OP said he had a Set.
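    (If the values are actually in a List or other collection, copying them into a Set drops the duplicates; a minimal sketch:)

    import java.util.Arrays;
    import java.util.LinkedHashSet;
    import java.util.List;
    import java.util.Set;

    public class UniqueSketch {
        public static void main(String[] args) {
            List<Integer> values = Arrays.asList(1, 1, 2, 3, 4, 4, 5);
            Set<Integer> unique = new LinkedHashSet<>(values);  // keeps first-seen order
            System.out.println(unique);                         // prints [1, 2, 3, 4, 5]
        }
    }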

  • How to generate an Interrupt for DMA transfer from Counter on NI-PCI-6602

    Dear guys,
    How do I generate an interrupt for a DMA transfer from a counter on the NI PCI-6602? I have set the DMA and DMA_INT bits, and also the global interrupt register,
    but no interrupt is generated in the interrupt callback function. And when I have set DMA_INT and the global interrupt register and then read the relevant register back, the relevant bit is still 0.
    I suspect there is an interrupt control register, like on the MITE, where you must write a value to 0xc4 to enable it.
    There is some code in my attachment. What can I do?
    Attachments:
    TEST.C ‏21 KB

    Dear Steven_T,
    The registers are in my attachment. If you have any ideas, please reply to me as soon as possible. Thank you!
    Attachments:
    PCI6602.pdf ‏818 KB
