Solaris 8 07/01 installation : Fast Data Access MMU Miss

hello,
Because of a hardware crash of my 9.1GB hard drive, I plugged a brand-new 40GB drive into my Ultra 10 workstation.
I tried to re-install my release of Solaris 8 (06/00), but that release detects only a 5GB hard drive (although the installation works).
I downloaded and burned Solaris 8 release 07/01 to get past the 32GB-per-drive limit, but when I type 'boot cdrom' or 'boot cdrom install' at the OpenBoot (ok) prompt, I get:
Fast Data Access MMU Miss.
I upgraded the PROM to the latest version and tried reburning the CD.
Does anyone have an idea to help me?
Greg.
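(Aside: the Ultra 10 firmware revision can be confirmed at the ok prompt before and after a PROM flash; banner and .version are standard OpenBoot commands, though the exact output varies by OBP release.)

ok banner
ok .version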

974009 wrote:
Hello all,
I have been working on a Sun Netra T1 105 for the past two weeks.
I have disabled auto boot, so when it starts it goes to the ok prompt.
The Solaris 11 CD is a new one that I burned just two weeks ago, at 24x speed, and it is the SPARC version that I downloaded.
I also tried installing a DVD-ROM drive with the Solaris 10 DVD, but it gave the same error.
When I type probe-ide it shows that the secondary master is a Removable ATAPI Model: TOSHIBA CDROM XM-7002Bc
When I type probe-scsi it shows:
Target 0
Unit 0 Disk Seagate ST39103LCSUN9.0G034AA
Thank you very much.
Hendrik
Edited by: 974009 on 29-Nov-2012 07:34

That system is too old for Solaris 11 and Solaris 10; that generation of UltraSPARC module isn't going to be recognized by those newer OSes. I suggest you go to your preferred online auction web site (for example, eBay) and get Solaris 8 (SPARC) or Solaris 9 (SPARC) media for that little 1U box.
Disabling auto-boot was smart; a good move for troubleshooting.
I hope trying the Solaris 10 DVD means you replaced the original CD reader with a DVD-capable drive: since that model computer was discontinued in 2002, only a CD-capable drive was ever qualified to be used in it, and the TOSHIBA XM-7002Bc that your probe-ide shows is exactly that.
A 9GB disk isn't going to be large enough for much more than just a core install of Solaris; you may again need to get larger drive(s) at that same online auction web site. The chassis can hold two internal disks.
When you get the chance, glance at this link to a copy of the Netra T1 105's resource page as it appeared in the old Sun System Handbook (SSH). The link is to a non-Oracle site that mirrors the old, now-discontinued free version of that SSH.
Next, while there's not much system documentation for that box offered by Oracle, you can glance through what there happens to be here:
http://docs.oracle.com/cd/E19628-01/index.html

Similar Messages

  • Fast instruction Access MMU miss

    Hello,
    Trying to install Solaris 9 4/04, and I get a Fast Instruction Access MMU Miss.
    Then I also get this:
    ERROR: Could not open file (/a/etc/vfstab)
    ERROR: Could not create the file system mount table (/etc/vfstab)
    ERROR: System installation failed
    and my vfstab looks like this:
    #device                                  device    mount     FS     fsck   mount     mount
    #to mount                                to fsck   point     type   pass   at boot   options
    /proc                                    -         /proc     proc   -      no        -
    fd                                       -         /dev/fd   fd     -      no        -
    swap                                     -         /tmp      tmpfs  -      yes       -
    swap                                     -         /tmp      tmpfs  -      no        -
    /proc                                    -         /proc     proc   -      no        -
    /devices/pci@1e,600000/ide@d/sd@0,0:b    -         /         ufs    -      no        ro
    /devices/pci@1e,600000/ide@d/sd@0,0:a    -         /cdrom    hsfs   -      no        ro
    fd                                       -         /dev/fd   fd     -      no        -
    Very strange. Could someone help me? What should I do?
    BR Nadia

    I think it is something to do with your boot-device. Bring the system down to the 'ok' prompt.
    Run printenv to see the boot devices:
    ok printenv boot-device
    boot-device = disk disk2 net
    Run the devalias command to see the path for the boot device:
    ok devalias disk
    disk /sbus/espdma@e,8400000/esp@e,8800000/sd@0,0
    Review the path to ensure that it is the correct path to the boot device.
    For example, if the path came back:
    disk /sbus/espdma@e,8400000/esp@e,8800000/sd@0,0:c
    the :c is telling you slice 2. Usually slice 2 is the backup slice and covers the entire disk, not just the root slice. Hence, the device alias is improperly set for root.
    To find out what the correct path is, boot the system to single user from CD-ROM, then use 'format' to find which slice is the correct root partition, and try to manually mount it.
    Say, for example, you find that slice 0 is the correct partition: umount it and 'init 0' to go back to 'ok'.
    Then do an nvalias:
    ok nvalias disk3 /sbus/espdma@e,8400000/esp@e,8800000/sd@0,0:a
    then boot and try again.
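    Putting the whole procedure together, a session might look like the sketch below; the device path is the one from this thread, and the c0t0d0s0 disk name is an assumption that will differ per system:
    ok boot cdrom -s
    # format                        (identify the correct root slice; say it is slice 0)
    # mount /dev/dsk/c0t0d0s0 /a    (manually mount the candidate root slice)
    # umount /a
    # init 0
    ok nvalias disk3 /sbus/espdma@e,8400000/esp@e,8800000/sd@0,0:a
    ok setenv boot-device disk3
    ok boot disk3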

  • Extremely slow data access on auto detected NAS drive

    I have a 27 inch iMac connected to a home network, on which I recently installed a Seagate Central NAS drive.  When Finder automatically finds the drive under Shared, it wants to automatically log me into the drive as "Guest" instead of the administrator user I created on the Seagate drive.  As Guest, the access time is extremely slow (takes minutes to display a list of folders).  However, when I use Finder's Go>Connect to Server, it allows me to enter and remember my userid and password, and the response time in accessing folders and files is extremely fast.  How can I get the auto detection to allow me to login with my userid and password in the same manner that I login using Go>Connect to Server, so I get fast data access on the drive?

    Hi, see if this helps...
    Do the Connect to Server route first.
    Mount the share once; then, once the globe icon shows up on the Desktop, drag that globe icon to the right side of the Dock, between Applications and Trash. You now have a dynamic mount; a quick click on the icon in the Dock will connect you.
    For multiple shares: mount them all, and when they show up on the Desktop, create a new folder (maybe named Shares) on the Desktop and drag all those shares into it (you can even rename them once there, if you wish). Then drag that folder to the right side of the Dock, between Applications and Trash; this should give you a popup menu to select from. :-)
    To automount them, drag those aliases to the Accounts pref pane's Login Items window.

  • Solaris 8 hangs and data access error on reboot

    Hi
    using Solaris 5.8 on an UltraSPARC-IIi at 360MHz.
    I know it is old hardware, but similar hardware is running fine. Here is the issue:
    The system booted and worked okay for some time, then the display hung, so the system was rebooted; it gave a data access error. Rebooted again and it came up fine.
    I have pasted the /var/adm/messages here:
    http://pastebin.com/m5d2d1d53
    Is it a CPU problem, a hard disk issue, a memory issue, or something else?
    What could the solution be? Is a kernel patch an option?
    uname -a
    SunOS sparc5 5.8 Generic_108528-07 sun4u sparc SUNW,Ultra-5_10
    test-all at the OBP did not show any errors; everything passed.
    Thanks

    I can't see the /var/adm/messages output. Since you have already performed a full diagnostic check, boot Solaris from the CD and then perform a full fsck on the root file systems to see whether any errors get reported. Keep running fsck two or three times until all the errors are cleaned up.
    You should also check for the latest patches available for this system architecture.
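    That boot-from-CD fsck pass is short (a sketch; c0t0d0s0 is an assumption, so substitute the root slice that format reports on your system):
    ok boot cdrom -s
    # fsck -y /dev/rdsk/c0t0d0s0
    (run fsck again until it reports the file system is clean)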

  • How to Install Oracle Data Access Components (ODAC) on Windows Server 2003?

    I recently installed "32-bit Oracle Data Access Components (ODAC) with Oracle Developer Tools for Visual Studio" on my computer (Windows 7, 64bit). Everything seems fine and I can develop and run my application in Visual Studio 2010 and IIS 7.
    Now, when I deploy my application to the server, it raises an error:
    Exception: System.TypeInitializationException: The type initializer for 'Oracle.DataAccess.Client.OracleCommand' threw an exception. ---> Oracle.DataAccess.Client.OracleException: The provider is not compatible with the version of Oracle client
    Obviously I need to install ODAC on the server, too. My server is:
    - Windows 2003 32 bit R2 (I know, I know!)
    - IIS 6
    So I downloaded the same installation from the Oracle website (ODAC 11.2 Release 5 and Oracle Developer Tools for Visual Studio [11.2.0.3.20]) and installed it on the server. But I am still getting the same error.
    PS: When I was installing, I chose Oracle Data Access Component for Oracle Client 11.2.0.3.20 in Oracle Universal Installer. Hmmmm. Should I choose "Oracle Server" instead? Screenshot
    Edited by: 1000434 on Apr 17, 2013 6:35 AM
    Edited by: 1000434 on Apr 17, 2013 6:36 AM

    ODP.NET, Unmanaged Driver uses unmanaged dependencies in the Oracle Client. The error you see means you have another Oracle Client installed on the Win2003 machine, and ODP.NET is attempting to load the incorrect Oracle Client version rather than the version you installed ODP.NET with.
    What you need to do is direct ODP.NET to where it can find the correct version of its unmanaged Oracle Client dependencies. This will generally be the bin directory of the Oracle Client home that was installed with ODP.NET.
    You can learn more about DllPath here:
    http://docs.oracle.com/cd/E20434_01/doc/win.112/e23174/InstallODP.htm#sthref94
    If you're not familiar with how to set ODP.NET settings in the Registry or .NET config files, you can read how to do that here:
    http://docs.oracle.com/cd/E20434_01/doc/win.112/e23174/featConfig.htm#sthref106
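    As a sketch, the config-file route looks roughly like this; the path below is an assumption and should point at the bin directory of the Oracle Client home that ODP.NET was installed with:
    <configuration>
      <oracle.dataaccess.client>
        <settings>
          <!-- hypothetical path: substitute your own ODP.NET Oracle home -->
          <add name="DllPath" value="C:\oracle\product\11.2.0\client_1\bin"/>
        </settings>
      </oracle.dataaccess.client>
    </configuration>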

  • Windows 7, file properties - Is "date accessed" ALWAYS 100% accurate?

    Hello,
    Here's the situation: I went on vacation for a couple of weeks, but before I left, I took the harddrive out of my computer and hid it in a different location. Upon coming back on Monday (January 10, 2011) and putting the harddrive back in my computer, I
    right-clicked on different files to see their properties. Interestingly enough, several files had been accessed during the time I was gone! I right-clicked different files in various locations on the harddrive, and all of these suspect files had been accessed
    within a certain time range (Sunday, January 09, 2011, approximately between 6:52:16 PM - 9:06:05 PM). Some of them had been accessed at the exact same time--down to the very second. This makes me think that someone must have done
    a search on my harddrive for certain types of files and then copied all those files to some other medium. The Windows 7 installation on this harddrive is password protected, but NOT encrypted, so they could have easily put the harddrive into an enclosure/toaster
    to access it from a different computer.
    Of course I did not right-click every single file on my computer, but did so in different folders. For instance, one of the folders I went through has different types of files: .mp3, .prproj, .3gp, .mpg, .wmv, .xmp, .txt with file sizes ranging from 2 KB
    to 29.7 MB (there is also a sub-folder in this folder which contains only .jpg files); however, of all these different types of files in this folder and its subfolder, all of them had been accessed (including the .jpg files from the sub-folder) EXCEPT the
    .mp3 files (if it makes any difference, the .mp3 files in this folder range in size from 187 KB to 4881 KB). Additionally, this sub-folder which contained only .jpg files (48 .jpg files to be exact) was not accessed during this time--only the .jpg files within
    it were accessed-- (between 6:57:03 PM - 6:57:08 PM).
    I thought that perhaps this was some kind of Windows glitch that was displaying the wrong access date, but then I looked at the "date created" and "date modified" for all of these files in question, and their created/modified dates and
    times were spot on correct.
    My first thought was that someone put the harddrive into an enclosure/toaster and viewed the files; but then I realized that this was impossible because several of the files had been accessed at the same exact time down to the second. So this made me think
    that the only other way the "date accessed" could have changed would have been if someone copied the files.
    Is there any chance at all whatsoever that this is some kind of Windows glitch or something, or is it a fact that someone was indeed accessing my files (and if someone was accessing my files, am I right about the files in question having been copied)? Is
    there any other possibility for what could have happened?
    Do I need to use any kinds of forensics tools to further investigate this matter (and if so, which tools), or is there any other way in which I can be certain of what took place in that timeframe the day before I got back? Or is what I see with Windows 7
    good enough (i.e. accurate and truthful)?
    Thanks in advance, and please let me know if any other details are required on my part.
    P.S. The harddrive is NTFS.

    Never mind.  Someone else already answered this for me:
    "I use last accessed-created date time stamps all the time when troubleshooting-investigating software installs, its been very accurate in my uses of it in NTFS file systems.
    Since some of the dates are while you were gone, I assume it was several days to a week, I would say someone did access those files.
    These timestamps along with other data are used by computer forensics teams to reconstruct what a user did on a computer.
    Experienced Hackers use software to alter these time stamps to cover their tracks when breaking into computer systems."
    http://superuser.com/questions/232143/windows-7-file-properties-is-date-accessed-always-100-accurate/232320#232320
    Additionally, I just now found out what happened. Someone else found the harddrive and thought it was theirs (since it was identical to one they had been missing), so they put it in their computer and scanned it with an anti-virus software. They realized it
    wasn't theirs and then put it back in the place I had hidden it.
    In my original question, I stated a possible theory that someone may have copied the files, but I later realized that copying in itself doesn't affect the "date accessed" of the original/source file, but rather only the "date accessed" of
    the copy.
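    (An aside, not from the quoted answer: on Windows Vista and later, NTFS last-access updates are disabled by default for performance, so "date accessed" is only maintained when that behavior has been re-enabled on the volume; the fact that the timestamps here updated at all suggests it was enabled wherever the drive was read. You can query the setting from a command prompt with:
    fsutil behavior query disablelastaccess
    where a result of 1 means last-access updates are disabled.)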
    Thanks to all those that may have read my question.

  • Issue while Installing Oracle Data Access Software for Windows

    All,
    I am getting the following error while installing Oracle Data Access Software for Windows. I am installing on Windows XP, with the Oracle 9i release 9.2.0.7.0 DB and client on the same box.
    It shows:
    The specified key was not found while trying to GetValue
    * Stop installation of all products
    * Stop installation of this component only.
    Kindly let me know why this error is showing up.
    Regards
    Ramesh

    Most probably you have hit this issue:
    "If you have more than one Oracle Home installed on the same machine (e.g. Oracle8i client and Oracle9i Release 2 client), use the Oracle Home Selector to run your applications with Oracle9i Release 2 client. "
    As documented in the Oracle Data Access Software for Windows, Release 9.2.0.4.0.
    ~ Madrid.

  • No Setup.exe in 64-bit Oracle Data Access Components (ODAC) Download

    Why is there no Setup.exe in the 64-bit Oracle Data Access Components (ODAC) download? Are the only installation instructions in the downloaded readme file?

    You're probably looking at an XCOPY bundle.
    If you're looking for an x64 11.2 bundle installed via the Oracle Installer, you'd need to get either the 11201 full client on OTN, or the 11202 full client on My Oracle Support. The 11202 full client includes support for .NET 4 if you need that.
    Hope it helps,
    Greg

  • Last Trap: Data Access Error

    Hello,
    I don't know what the appropriate forum to ask this would be; please suggest another one if this is not the best one.
    We have a PCI-Express board (that we developed) that we're attempting to
    use in a Sparc system. It is an Ultra 25 (also says Ultra SPARC IIIi).
    We've attempted to use both Solaris 9 and 10.
    With our board in an x8 slot, the system will not boot. If our board
    is in an x16 slot, the system boots okay and the system can access
    the board properly. Our board is an x1 board.
    The following error message is seen during boot attempts:
    "ERROR:/pci@e,600000/pci@1/pci@0/usb@1c,1: Last Trap: Data Access Error"
    In addition, systems are able to boot and access our boards properly when the
    boards are in x4, x8, and x16 slots of x86 systems running Windows and other
    operating systems.
    Could anyone suggest what might be wrong?
    Thanks,
    Nathan

    Hello Sane,
    I'm seeing a weird problem on a new v890 server...
    Record the current OBP settings, then issue set-defaults.
    If the problem persists, open a service case.
    Michael
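    That reset procedure takes three commands at the ok prompt (a sketch; record the printenv output first, since set-defaults discards any customized settings):
    ok printenv
    ok set-defaults
    ok reset-all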

  • Data Access Error when importing Standard Reports

    After installation, when I try to import the standard reports, I get a "Data Access Error".
    FDM 9.3.1
    Oracle 10g
    In fdm.err, I find the following description:
    ORA-12899: value too large for column "MYFDMSCHEMA"."TREPORT"."REPORTDESC" (actual: 152, maximum: 75)
    It seems that the report descriptions are too long for my Oracle repository...
    Any idea ?
    Regards

    Yes.
    Edited by: TonyScalese on Dec 2, 2010 11:52 AM

  • Generic Data Access For All Class

    Hello
    I am doing one experiment on Data Access. In traditional system We have to write each Insert, Update, Delete code in data access for each table.
    My City Table Class:
    public class TbCitiesModel
    {
        string _result;
        int _cityID;
        int _countryID;
        string _name;
        int _sortOrder;
        bool _enable;
        DateTime _createDate;
        string _countryName;

        public string result
        {
            get { return _result; }
            set { _result = value; }
        }
        public int cityID
        {
            get { return _cityID; }
            set { _cityID = value; }
        }
        public int countryID
        {
            get { return _countryID; }
            set { _countryID = value; }
        }
        public string name
        {
            get { return _name; }
            set { _name = value; }
        }
        public int sortOrder
        {
            get { return _sortOrder; }
            set { _sortOrder = value; }
        }
        public bool enable
        {
            get { return _enable; }
            set { _enable = value; }
        }
        public DateTime createDate
        {
            get { return _createDate; }
            set { _createDate = value; }
        }
        public string countryName
        {
            get { return _countryName; }
            set { _countryName = value; }
        }
    }
    Traditional Code:
    public List<TbCitiesModel> DisplayCities()
    {
        List<TbCitiesModel> lstCities = new List<TbCitiesModel>();
        using (SqlConnection connection = GetDatabaseConnection())
        using (SqlCommand command = new SqlCommand("STCitiesAll", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            SqlDataReader reader = command.ExecuteReader();
            while (reader.Read())
            {
                lstCities.Add(new TbCitiesModel());
                lstCities[lstCities.Count - 1].cityID = Convert.ToInt32(reader["cityID"]);
                lstCities[lstCities.Count - 1].countryID = Convert.ToInt32(reader["countryID"]);
                lstCities[lstCities.Count - 1].name = Convert.ToString(reader["name"]);
                lstCities[lstCities.Count - 1].sortOrder = Convert.ToInt32(reader["sortOrder"]);
                lstCities[lstCities.Count - 1].enable = Convert.ToBoolean(reader["enable"]);
                lstCities[lstCities.Count - 1].createDate = Convert.ToDateTime(reader["createDate"]);
            }
        }
        return lstCities;
    }
    The above code is used to fetch all cities in the table. But when there is another table, e.g. "TBCountries", I have to write another method to get all countries. So each time it is almost the same code; just the table and parameters change.
    So I decided to work on only one generic method to fetch data from the database.
    Generic Code:
    public List<T> DisplayCitiesT<T>(T TB, string spName)
    {
        var categoryList = new List<T>();
        using (SqlConnection connection = GetDatabaseConnection())
        using (SqlCommand command = new SqlCommand(spName, connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            foreach (var prop in TB.GetType().GetProperties())
            {
                string Key = prop.Name;
                string Value = Convert.ToString(prop.GetValue(TB, null));
                if (!string.IsNullOrEmpty(Value) && Value.Contains(DateTime.MinValue.ToShortDateString()) != true)
                {
                    command.Parameters.AddWithValue("@" + Key, prop.GetValue(TB, null));
                }
            }
            SqlDataReader reader = command.ExecuteReader();
            while (reader.Read())
            {
                int i = 0;
                TB = Activator.CreateInstance<T>();
                int colCount = reader.FieldCount;
                foreach (var prop in TB.GetType().GetProperties())
                {
                    if (prop.Name != "result" && i <= (colCount - 1))
                    {
                        prop.SetValue(TB, reader[prop.Name], null);
                    }
                    i++;
                }
                categoryList.Add(TB);
            }
        }
        return categoryList.ToList();
    }
    Calling method:
    TbCitiesModel c = new TbCitiesModel();
    Program p = new Program();
    List<TbCitiesModel> lstCities = p.DisplayCitiesT<TbCitiesModel>(c, "STCitiesAll");
    foreach (TbCitiesModel item in lstCities)
    {
        Console.WriteLine("ID: {0}, Name: {1}", item.cityID, item.name);
    }
    Now it's working fine, but I have tested it with 1,000,000 (10 lakh) records in the TBCities table; here are the results:
    1. The traditional method took almost 58 - 59 - 58 - 59 - 59 seconds over 5 runs.
    2. The generic method took 1.4 - 1.3 - 1.5 - 1.4 - 1.4 (minutes.seconds).
    So by these test results, the generic method is slower [because it has three foreach loops], but the data set is very big, almost 1,000,000 (10 lakh) records, so it might perform well with fewer records.
    1. So my question is: can I use this method in real-world applications? Or is there any performance optimization for this method?
    2. Also, can we use this in ASP.NET C# projects?
    Owner | Software Developer at NULLPLEX SOFTWARE SOLUTIONS http://nullplex.com

    Hi Mayur Lohite,
    Q1: It is not really fair to compare the generic code with the traditional code; the main issue is not the generics.
    After taking a look at your generic code, it is the reflection that slows it down:
    TB = Activator.CreateInstance<T>();
    As reflection is a truly late-bound approach to working with your types, the more types you have in a single assembly, the slower you go. Basically, few people try to build everything on reflection; using reflection unnecessarily will make your application very costly.
    Here is a good article about this issue; please take a look:
    Reflection is Slow or Fast? A Practical Demo
    Q2: Is there any performance optimization for this method?
    The same article presents some .NET techniques for using reflection optimally and efficiently: optimizing object creation with reflection, and in particular optimizing reflection with the Emit method.
    Best regards,
    Kristin
    Hello Kristin,
    Can you please tell me how to optimize the reflection in my code?
    (the same DisplayCitiesT<T> code as posted above)
    Thank you.
    Owner | Software Developer at NULLPLEX SOFTWARE SOLUTIONS http://nullplex.com
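    For anyone wondering what "optimizing object creation with reflection" looks like in practice, here is a minimal sketch (the SetterCache name and shape are illustrative, not taken from the linked articles): walk GetProperties() once per type and compile each property setter into a delegate, then reuse those delegates for every row instead of calling prop.SetValue in the loop.

    using System;
    using System.Collections.Concurrent;
    using System.Collections.Generic;
    using System.Linq.Expressions;
    using System.Reflection;

    static class SetterCache
    {
        // Compiled setter delegates, built once per type and reused for every row.
        static readonly ConcurrentDictionary<Type, Dictionary<string, Action<object, object>>> cache =
            new ConcurrentDictionary<Type, Dictionary<string, Action<object, object>>>();

        public static Dictionary<string, Action<object, object>> For(Type type)
        {
            return cache.GetOrAdd(type, t =>
            {
                var setters = new Dictionary<string, Action<object, object>>();
                foreach (PropertyInfo prop in t.GetProperties())
                {
                    if (!prop.CanWrite) continue;
                    ParameterExpression target = Expression.Parameter(typeof(object), "target");
                    ParameterExpression value = Expression.Parameter(typeof(object), "value");
                    // Builds: (target, value) => ((T)target).Prop = (PropType)value
                    Expression assign = Expression.Assign(
                        Expression.Property(Expression.Convert(target, t), prop),
                        Expression.Convert(value, prop.PropertyType));
                    setters[prop.Name] =
                        Expression.Lambda<Action<object, object>>(assign, target, value).Compile();
                }
                return setters;
            });
        }
    }

    In DisplayCitiesT<T> you would fetch var setters = SetterCache.For(typeof(T)); once before the reader loop and call setters[name](TB, reader[name]) per column; Activator.CreateInstance<T>() can likewise be replaced by a cached, compiled "new T()" delegate.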

  • Issue trying to install Oracle Data Access Software for Windows

    I am trying to install the Data Access software on one of my many workstations and am having an issue with one of them. When the installer comes up it gives me an error:
    Invalid Inventory List and it exits.
    What do I have to do to get it to recognize my Oracle home?

    Most probably you have hit this issue:
    "If you have more than one Oracle Home installed on the same machine (e.g. Oracle8i client and Oracle9i Release 2 client), use the Oracle Home Selector to run your applications with Oracle9i Release 2 client. "
    As documented in the Oracle Data Access Software for Windows, Release 9.2.0.4.0.
    ~ Madrid.

  • Data Access Manager

    Hi all
    I had to design a data access layer framework (yes, I DID suggest Hibernate at the time, duffy ;-)) and came up with a rather good solution, IMHO. As the data access layer should be generated by a tool, there's but one possibility: map one DTO per table and populate them in the DAO. That's how it was before I arrived, and it won't change.
    Here is how it works :
    public interface DAO {
        // marker interface
    }

    // DAO interface for table 'xxx'
    public interface XxxDAO extends DAO {
        public static final class Row {
            // column mappings to private java fields
            // full constructor
            // public getters & setters
        }
        public Row get(/* primary key */);
        public Set<Row> select(/* index */);
        public int insert(Row row); // returns the serial if any
        public void update(/* primary key */, Row row);
        public void delete(Row row);
    }
    It is sufficient to explain my problem:
    1. there's no primary key in our tables (eeeeeeeeewwwww, I know, nobody listens to me)
    2. OTOH there may be multiple unique indices
    3. ... and of course many non-unique indices
    instead of creating getX(), getY(), ..., selectX(), selectY(), blablabla... methods with as many parameters as the respective index holds, I'd like to be able to do something like:
        public Row get(Key key); // any existing unique index
        public Set<Row> select(Index index); // any existing non unique index
        public int insert(Row row); // returns the serial if any
        public void update(Key key, Row row); // here 'key' would be the "main" unique index (I'll deal with that ;))
        public void delete(Row row);
    But as soon as I start to think about how to modify the SQL implementation, I'm quite stuck. How could the (SQL)DAO know which key it receives and then build the query in an appropriate way? I can't give any SQL responsibility to the Key class since it's intended to remain technology-independent, and a factory sounds heavy and dodgy to me...
    Will I have to duplicate these get and select methods and give them unpronounceable names based on their indices (indices can hold the same data types in the same order, unfortunately; that's why the Key/Index classes would have helped)?
    Do I have to design generic Key and Index classes that can be used as DTO for any query constraints (like an array containing the index fields as Objects)... by doing so, I stop enforcing that XxxDAO should only receive XxxKey and XxxIndex...
    This problem is driving me mad, any help is welcome

    > I'd like to open by saying whoever suggested you reduplicate the work of so many teams of people single-handedly is a complete and total idiot, and you should be sure you get paid for every minute of your wasted time and futile effort.
    I sure will... ;-)
    > You need to generate SQL for everything. YOU are not the one that gets to act like the database is abstract, YOU are the one that gets to make it so others can act that way.
    ..right
    > The key can most certainly have SQL. What you are creating is your internal DAO, not what the end user will think of as his DAO. He does not get to see your key or what it really does.
    That's not how I was seeing things, actually. I wanted to create the Key and Index classes in order to facilitate the coder's life:
    * unique index => Row get(Key) method
    * non unique index => Set<Row> select(Index) method
    As I want to "abstract myself away" from SQL (the data might come from a Web Service, flat files, or even Hibernate or any other O/R mapping tool), my (top) key interface/class shouldn't hold any implementation-specific code.
    > so like with JDO you can create your key based on the metadata of the end user. that is how you know the type of key and whatnot.
    > Why don't you tell that bonehead company to just assign you to hibernate or jpox and allow you to work there solving the same problem and getting much faster results for us all, than this...
    > Fascinating.
    Actually, at the time the decision was made, Hibernate was (slightly) incompatible with the older versions of the DBMS installed at our customers'.
    I bet it's not the case anymore and we've spent money for nothing, but I'm not the one who gets to make the decisions ;-)
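    To make the "internal DAO" point concrete, here is a minimal sketch (all names hypothetical): the technology-neutral Key only names a unique index and carries its values, while the SQL implementation owns the metadata that turns an index name into a WHERE clause, so no SQL leaks into the Key itself.

    import java.util.HashMap;
    import java.util.Map;
    import java.util.StringJoiner;

    // Technology-neutral: names the index and carries its values; no SQL in here.
    final class Key {
        private final String indexName;
        private final Object[] values;
        Key(String indexName, Object... values) {
            this.indexName = indexName;
            this.values = values;
        }
        String indexName() { return indexName; }
        Object[] values() { return values; }
    }

    // SQL side: per-table metadata maps each unique index name to its column list.
    final class SqlXxxDao {
        private static final Map<String, String[]> INDEX_COLUMNS = new HashMap<>();
        static {
            INDEX_COLUMNS.put("byCode", new String[] { "code" });
            INDEX_COLUMNS.put("byNameAndCountry", new String[] { "name", "country_id" });
        }

        // Renders "name = ? AND country_id = ?" for new Key("byNameAndCountry", ...).
        String whereClauseFor(Key key) {
            String[] cols = INDEX_COLUMNS.get(key.indexName());
            if (cols == null) {
                throw new IllegalArgumentException("unknown index: " + key.indexName());
            }
            StringJoiner clause = new StringJoiner(" AND ");
            for (String col : cols) {
                clause.add(col + " = ?");
            }
            return clause.toString();
        }
    }

    The code generator would emit the INDEX_COLUMNS table from the schema metadata, so callers still only ever see get(Key) and select(Index).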

  • Data Access Account as Default RunAs Account in Tasks

    Hi all,
    i have the problem in a brand new SCOM 2012 R2 Installation every Tasks that I execute (like DCDIAG for AD or Site Discovery for SCCM) is being executed with the System Center Data Access Service Account (here "scdac") from SCOM. Also if i define
    another specific RunAs Account like the "scadmin" (Domain Admin) Account. Its very strange because the System Center Data Access Account isn't defined in any way as an Account in the SCOM Console by default. Here are some screenhots:
    Either....
    Or...

  • Problems in using Gplus Data Access Component to invoke SAP-RFCs

    Hello integration gurus!
    We need to make RFC calls out of routing strategies within Genesys Universal Routing Server (URS).
    We learned that we need to install the Gplus Data Access Component (DAC) in order to do so. After copying rfc32lib.dll into the installation directory of the DAC, the DAC establishes a valid RFC connection to our SAP CRM 5.0 system. But when the URS requests the definition of the specific remote function (the DAC provides this information as a WSDL for the URS), it gets a timeout. We now think that we need special files (perhaps XML schemata provided by SAP) to transform the DDIC types provided by the RFC into the SOAP-specific data types communicated from the DAC to the URS.
    Where can we get these files?
    Does anyone have experience with such an integration scenario?
    Kind regards
    Michael

