When working from the Windows command line, you can do a quick test to validate your SMTP connectivity using PowerShell:
c:\> powershell -ExecutionPolicy Bypass
PS c:\> Send-MailMessage -To <TO> -From <FROM> -Subject "testing123" -Body "this is a test" -SmtpServer <SMTPServer> -Port 25
And if the mail server is accessed over TLS/SSL with SMTP authentication enabled:
PS c:\> Send-MailMessage -To <TO> -From <FROM> -Subject "testing456" -Body "this is a secure test" -SmtpServer <SMTPServer> -Port 587 -UseSsl -Credential (Get-Credential)
This is easier than dropping down to telnet, which is typically not installed on a modern Windows host.
Continue reading “Sending SMTP Mail from Windows Using PowerShell”
The most common way of integrating your existing Identity Management system with Documentum is to offer SSO (Single Sign-On) via the LDAP Synchronization job.
This requires that you set a Base DN for Documentum to search, but real-world LDAP servers commonly contain LDAP referrals in that search space. Referral chasing is transparent, but it can hurt performance and can even cause the job to time out if the referred host name is not resolvable from the Content Server host.
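Under the hood, Java directory access of this kind goes through the JDK's JNDI LDAP provider, where referral handling is controlled by a single environment property. The sketch below shows that underlying mechanism only (the factory class and URL are the standard JDK values; this is not the Documentum configuration surface itself):

```java
import java.util.Hashtable;
import javax.naming.Context;

/** Builds a JNDI environment whose searches do not chase LDAP referrals. */
public class LdapEnv {
    public static Hashtable<String, String> referralIgnoringEnv(String providerUrl) {
        Hashtable<String, String> env = new Hashtable<>();
        env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
        env.put(Context.PROVIDER_URL, providerUrl);
        // "ignore" asks the provider not to follow referrals. Directories that
        // honor the accompanying Manage Referral control return referral
        // entries as ordinary entries; those that do not (notably Active
        // Directory) surface a PartialResultException that the caller can
        // catch and discard, rather than a DNS timeout on an unreachable host.
        env.put(Context.REFERRAL, "ignore");
        return env;
    }
}
```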
Continue reading “Documentum: Ignoring Referrals from the LDAP Synch Job”
Identity Management for On-Premise Applications
Our industry today has some very proven technologies for providing a single set of login credentials to applications installed on-premise. Most commonly, companies use a central Identity Management system (e.g. Microsoft Active Directory, Oracle Internet Directory, IBM Tivoli), and these systems implement an LDAP interface that third-party applications can call to validate user credentials.
This allows end users to log in to their internal HR portal, SharePoint site, or local Documentum Webtop with the same credentials they used to gain entrance to their Windows desktop, a capability termed SSO (Single Sign-On). This has dramatically improved the end-user experience, as well as IT's ability to manage the risk and policies surrounding identity management.
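The "validate user credentials" call that such applications make against the directory is, at bottom, just a bind attempt: if the server accepts a simple bind as the user's DN with the supplied password, the credentials are good. A minimal sketch using the JDK's JNDI provider (the server URL and DN format below are illustrative):

```java
import java.util.Hashtable;
import javax.naming.Context;
import javax.naming.NamingException;
import javax.naming.directory.InitialDirContext;

public class LdapAuth {
    /** Builds the JNDI environment for a simple bind as the given user. */
    public static Hashtable<String, String> bindEnv(String url, String userDn, String password) {
        Hashtable<String, String> env = new Hashtable<>();
        env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
        env.put(Context.PROVIDER_URL, url);
        env.put(Context.SECURITY_AUTHENTICATION, "simple");
        env.put(Context.SECURITY_PRINCIPAL, userDn); // e.g. "cn=jdoe,ou=users,dc=example,dc=com"
        env.put(Context.SECURITY_CREDENTIALS, password);
        return env;
    }

    /** Credentials are valid exactly when the bind succeeds. */
    public static boolean validate(String url, String userDn, String password) {
        try {
            new InitialDirContext(bindEnv(url, userDn, password)).close();
            return true;
        } catch (NamingException e) {
            return false; // bad credentials, unknown DN, or unreachable server
        }
    }
}
```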
Continue reading “EMC OnDemand: Federated Identity Management and Silent SSO”
The concept of custom methods which run directly on the Java Method Server has proven an extremely useful extension point for Documentum developers and solutions architects. Whether used in a workflow activity to integrate with an enterprise message queue or as an action for Webtop users who need temporarily escalated privileges to apply legal retention, custom Java methods have become a key customization in most customer environments. Features include:
- Lightweight invocation compared to dmbasic and external Java methods, which must spawn a separate process for each execution
- DFC operations execute on the same host as the Content Server, which minimizes the effects of network latency and throughput
- Can be configured to run as the repository owner, which grants them elevated privileges to content when necessary
- Provide the logic for workflow auto-activities, able to utilize any Java library including the DFC
- Provide the logic for custom job/methods, again able to utilize the full power of Java and its libraries
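A minimal skeleton of such a method, for illustration only: in a real deployment the class would implement the Java Method Server's entry-point interface (com.documentum.mthdservlet.IDmMethod in classic DFC installations), which is omitted here so the sketch compiles standalone, and the parameter map is simplified to plain strings.

```java
import java.io.IOException;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;
import java.util.Map;

/**
 * Hypothetical skeleton of a custom Java method for the Java Method Server.
 * The method name and "object_id" launch argument are illustrative, not a
 * documented contract.
 */
public class ApplyRetentionMethod {
    public void execute(Map<String, String> params, OutputStream output) throws IOException {
        // The Content Server passes the method's launch arguments (docbase
        // name, user, target object id, ...) in the parameter map.
        String objectId = params.get("object_id");
        // A real method would open a DFC session here -- as the installation
        // owner when configured to run with elevated privileges -- and act on
        // the object; this sketch only echoes what it would do.
        output.write(("applied retention to " + objectId + "\n").getBytes(StandardCharsets.UTF_8));
    }
}
```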
Continue reading “EMC OnDemand: Best Practices for Custom Methods”
Content delivery is one of the primary use cases for a Content Management system. When users are spread across six continents, you must have an implementation that ensures timely access for all users – not just those on the local network. A typical scenario involves the database and primary Content Server deployed in the main North American or European datacenter with remote user groups scattered throughout the world. These remote offices often have limited network throughput, which makes it even more challenging.
Enter Branch Office Caching Services
Documentum has dealt with this scenario since its inception and has a myriad of options for streamlining delivery to users in geographically distributed locations or different departments, among them: remote content servers with distributed storage areas, federations with replication, and Branch Office Caching Services (BOCS). When we, as OnDemand Architects, looked at our customer needs and use cases, it became apparent that BOCS would be instrumental in providing remote users the experience they expected – which essentially boils down to application and content access on par with a local deployment.
Working with our customers in the real world, we have seen that web application access for remote users (whether via Webtop, D2, or xCP 2.0) is not significantly impaired by the incremental increase in latency to return HTML/JS/CSS. The primary factor in application response, and in users’ perception of performance, is the time it takes to transfer content during import, export, and checkin/checkout operations.
Continue reading “EMC OnDemand: Enabling Distributed Content Features and BOCS”
As you can imagine, potential customers have a lot of very legitimate questions when considering the move to EMC OnDemand. For both new customers and those migrating their existing content into the EMC secure private cloud, one of the questions we hear a lot is, “Why would I choose EMC OnDemand instead of Amazon EC2?”
I love this question. It gives us a chance to talk about all the EMC OnDemand value-add without the appearance of grandstanding. And in the end, it is clear to everyone that this is an apples-to-oranges comparison, but the explanation allows us to highlight some key points that resonate very deeply with an EMC customer evaluating cloud offerings.
Continue reading “EMC OnDemand: OnDemand versus Amazon EC2”
The Documentum Foundation Services (DFS) introduced developers to the ‘DFS Data Model’, a rich object model that is capable of representing complex repository objects and relationships during interactions with content services. For those with a DFC programming background, it can be a challenge to shift into the DFS paradigm which focuses on service oriented calls and relies on the data model to fully describe the requested transformations.
Based on my contact with customers through formal Service Requests as well as the EMC Support Forums, I see that many architects, when presented with this unfamiliar landscape, instantly assume that the best course of action is to design a custom model to shield other developers from the perceived complexity of the DFS data model. Although well-intentioned, this initial reaction to change can have serious implications that are not often considered or understood at implementation time.
While abstracting the construction of the DFS data model carries a great deal of value, replacing the DFS data model with a custom model should be done only with deliberate purpose and awareness. I will use this article to explore the motivations behind the development of these “simplified” models, their ramifications in a long-term SOA strategy, and how you can deliver convenience without making integration unnecessarily difficult or hindering the building-block nature of SOA.
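To make the distinction concrete, here is a sketch of the pattern being advocated: a small builder that hides the assembly steps while still handing callers the native model type as the service contract. DfsDoc is a deliberately simplified, hypothetical stand-in for the real DFS DataObject/PropertySet classes (package com.emc.documentum.fs.datamodel.core), used only so the example is self-contained.

```java
import java.util.LinkedHashMap;
import java.util.Map;

/**
 * Abstracting *construction* of the service data model without replacing it:
 * the builder removes boilerplate, but build() still returns the native type,
 * so the SOA contract remains the standard model.
 */
public class DataObjectBuilder {
    /** Minimal stand-in for a DFS DataObject: an object type plus properties. */
    public static final class DfsDoc {
        public final String type;
        public final Map<String, Object> properties;
        DfsDoc(String type, Map<String, Object> properties) {
            this.type = type;
            this.properties = properties;
        }
    }

    private final String type;
    private final Map<String, Object> props = new LinkedHashMap<>();

    public DataObjectBuilder(String type) { this.type = type; }

    public DataObjectBuilder property(String name, Object value) {
        props.put(name, value);
        return this; // fluent chaining hides the property-set assembly
    }

    /** Callers receive, and pass onward, the native model type. */
    public DfsDoc build() {
        return new DfsDoc(type, new LinkedHashMap<>(props));
    }
}
```

A convenience layer shaped like this stays a building block: any consumer that already speaks the standard model can interoperate, which is exactly what a wholesale custom replacement model gives up.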
Continue reading “Documentum: Enterprise use of the DFS Data Model”