Introduction

Living donor liver transplantation (LDLT) in children was introduced in 1989 [1] and by 1992 had achieved a 90 % success rate [2]. The rapid growth of the transplant waitlist in the 1990s and the shortage of donor organs stimulated the extension of LDLT to adults (adult-to-adult LDLT, AA-LDLT). Early experience with AA-LDLT was marked by technical complications pertaining to a unique type of early graft dysfunction related to insufficient liver mass, termed small-for-size syndrome (SFSS) [3–5]. Resection of segments 2 and 3 yielded a perfect graft for a small child, but our early attempts to use these grafts in adults and larger children were met with uniform failure (unpublished).

First performed in Kyoto by Tanaka et al. in 1990 [6], right hepatectomy for the living donor seemed a daunting undertaking with substantial risk to the donor. However, subsequent reports established both the safety in donors and the efficacy in recipients of right lobe LDLT [7–9]. Expectedly, right lobe donation was rapidly embraced by major centers in Asia, North America, and Europe and, with few exceptions, became the standard approach for AA-LDLT by 2001. The widespread adoption of the right lobe approach for adults was largely based on the issue of SFSS and early graft dysfunction that plagued the preliminary efforts with left lobe grafting. Gradual understanding of the biology of the small graft and numerous technical innovations led to current expectations of a 5-year survival of 83 % for the recipient with AA-LDLT [10].

Much of the mortality seen early with AA-LDLT has been attributed to a lack of understanding of the sequelae of smaller graft implantation. Here we describe the initial experience with early graft dysfunction and the medical management and surgical strategies used to lessen its occurrence and achieve optimal outcomes.

Early Adult-to-Adult Living Donor Liver Transplantation Experience

Success in pediatric LDLT facilitated the push to improve access to transplantation in adult recipients [1]. However, this early foray into AA-LDLT was met with impediments not witnessed in pediatric recipients. In a single center experience comparing 23 AA-LDLT to 22 pediatric LDLTs, the 1-year survival was 91 % for the children and an alarming 65 % for the adults [11].

Clinical Manifestations of Small-for-size Syndrome

In 1996, as we began to extend LDLT to physically larger recipients using left lateral segments or left lobes, poor graft function was observed in 40 % of patients transplanted with smaller grafts [12••, 13]. Consequently, an association was made between function and graft size. This effort was greatly facilitated by the work of Urata et al. [14•], who depicted the relationship between body size and liver volume. This characterization made it possible to estimate the expected size of a healthy liver, termed the standard liver volume (SLV). The ratio of the graft weight to the SLV represented the graft fraction. Patients with grafts representing 50 % or less of the expected liver weight were subject to coagulopathy and cholestasis in the peri-operative period, also known as SFSS [12••].
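As an illustration of how these quantities relate, the sketch below estimates SLV from the Urata formula (SLV in mL = 706.2 × body surface area in m² + 2.4, with body surface area from the DuBois formula) and derives the graft fraction. The patient values are hypothetical, and formula coefficients vary between published models, so this is illustrative arithmetic rather than a clinical tool.

```python
def body_surface_area(height_cm, weight_kg):
    """DuBois formula for body surface area (m^2)."""
    return 0.007184 * height_cm**0.725 * weight_kg**0.425

def standard_liver_volume(height_cm, weight_kg):
    """Urata formula: SLV (mL) = 706.2 x BSA (m^2) + 2.4."""
    return 706.2 * body_surface_area(height_cm, weight_kg) + 2.4

def graft_fraction(graft_volume_ml, height_cm, weight_kg):
    """GV/SLV as a percentage; values of 50 % or less were
    associated with SFSS in the early series cited above."""
    return 100.0 * graft_volume_ml / standard_liver_volume(height_cm, weight_kg)

# Hypothetical recipient: 170 cm, 70 kg, receiving a 600 mL graft
slv = standard_liver_volume(170, 70)
print(f"SLV ~ {slv:.0f} mL, graft fraction ~ {graft_fraction(600, 170, 70):.0f} %")
```

For this hypothetical recipient the SLV works out to roughly 1.3 L, so a 600 mL graft falls below the 50 % graft fraction at which SFSS was observed.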

The size of the graft in LDLT is determined by the extent of hepatectomy and the relative size of the donor and recipient. Once it became clear that there was a lower limit to graft size in relation to the success of a transplant, it became imperative to identify that limit to guide donor and recipient selection. A team in Kyoto, initially led by Tanaka, rapidly accumulated a vast experience with LDLT and introduced an assessment of graft size relative to recipient weight. They defined the graft-recipient weight ratio (GRWR), equal to graft weight (kg)/recipient weight (kg) × 100, expressed as a percentage. Based on their work, a GRWR of 0.8-1.0 was established as a lower limit to prevent SFSS [15•].
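A minimal sketch of the GRWR arithmetic and the 0.8 lower limit, using a hypothetical graft and recipient:

```python
def grwr(graft_weight_g, recipient_weight_kg):
    """Graft-recipient weight ratio, expressed as a percentage:
    graft weight (kg) / recipient body weight (kg) x 100."""
    return (graft_weight_g / 1000.0) / recipient_weight_kg * 100.0

# A hypothetical 600 g graft in a 70 kg recipient gives GRWR ~ 0.86,
# just above the 0.8 lower limit proposed by the Kyoto group.
ratio = grwr(600, 70)
print(f"GRWR = {ratio:.2f}",
      "(above 0.8 lower limit)" if ratio >= 0.8 else "(below 0.8: SFSS risk)")
```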

Importantly, the clinical hallmarks of SFSS include coagulopathy, cholestasis and ascites production, which lead to clinical deterioration and mortality in recipients with small donor grafts [16]. These functional deficiencies and impairments of hepatic blood flow ultimately lead to renal dysfunction, gastrointestinal bleeding and infectious complications, resulting in a high, though variable, mortality rate.

The Pathological Characteristics of Small-for-size Syndrome

Mechanism of Small-for-size Syndrome

While the underlying mechanism of SFSS remains elusive, the central problem is the inability of a smaller graft to readily adapt to the hostile milieu created by cirrhosis. Replacement of a cirrhotic liver with a healthy whole liver creates a situation in which the graft's low-resistance vascular bed accommodates the amplified cardiac output and increase in portal blood flow with a compensatory decrease in hepatic artery flow [17]. Hadengue et al. [18] demonstrated this phenomenon using Swan-Ganz catheters and indocyanine dye infusion: cardiac output and hepatic blood flow were both elevated after transplantation, suggestive of increased portal vein flow (PVF) and decreased hepatic artery inflow.

The reflexive decrease of hepatic arterial blood flow in the face of increased portal perfusion, termed the hepatic artery buffer response, is an established physiologic principle and has been described in deceased donor whole-liver transplantation [19, 20]. While many theories have been advanced as to why this reciprocal flow pattern occurs, adenosine "washout" as described by Lautt [21] has been accepted as the most plausible explanation. In this model, adenosine is constitutively released around the hepatic arterioles, producing a relative vasodilatation. In cases of increased PVF, dilution of the adenosine is thought to cause vasoconstriction in the hepatic arterial vasculature. Porcine models in which the graft volume/standard liver volume (GV/SLV) ratio was 20 % illustrate this vasospastic response in LDLT. Fascinatingly, adenosine infusion through the gastroduodenal artery at the time of LDLT helped rescue the grafts from the pathological responses of ischemic cholangiopathy and centrilobular necrosis [22]. Moreover, adenosine infusion produced a 78 % survival at 2 weeks, as opposed to 25 % survival in the untreated group with a GV/SLV of 20 %. Denervation of the hepatic vasculature upon transplantation has also been postulated as a reason for increased PVF. To date, this theory has not been vetted, but it presupposes autonomic disturbances in the denervated graft.

Nevertheless, this increase in portal flow has proved to be damaging to the allograft. The relationship between SFSS and portal hypertension was posited in an early animal study by Ku et al. [23]. When the portal hypertension was abrogated with a portocaval shunt (PCS), dogs with partial grafts survived for a mean of 5.3 days, whereas the unshunted animals all died of liver failure within a mean of 1.8 days.

Marcos et al. [24] expanded upon this postulated mechanism of portal hypertension experienced by a partial graft as a causative factor in SFSS. In a clinical study of 44 AA-LDLTs, the portal flow rose dramatically upon implantation, suggesting that the reduced vascular endothelial bed of a partial graft was ill-equipped to handle the full, and oftentimes robust, venous flow from the splanchnic system. Importantly, this reduction in receptive vascular bed is often coupled with the hyperkinetic circulatory state of liver disease, characterized by high cardiac output and increased splanchnic vascular bed flow [25]. Garcia-Valdecasas et al. [26] demonstrated the difference in PVF between cirrhotic and healthy patients by comparing the PVF of donors and recipients, observing a fourfold augmentation in the recipients' splanchnic flow. Interestingly, this increased PVF seems to return to normal after 3 months, suggesting increased accommodation with growth of the graft and a resultant enlarged intrahepatic vascular volume [27].

Pathological Consequences of Small-for-size Syndrome Underscore the Effects of Portal Hypertension on Smaller Grafts

While many observations regarding SFSS are limited to the clinical level, Kelly et al. [22] provided a translational approach to the understanding of this early graft dysfunction. In a porcine model of SFSS, transplanted livers with a 20 % GV/SLV were assessed histologically after 5 days. The increased splanchnic flow seen in SFSS resulted in severe portal microvascular injury, with peri-portal sinusoidal congestion and frank rupture. Interestingly, the severity of microscopic damage was inversely proportional to the size of the graft and could be seen as early as 1 h after reperfusion. In our clinical report, we were struck by the paradoxical findings of "ischemic" injury in the early biopsy, followed by a dense cholestasis in later biopsies, with eventual resolution to normal histology [14•]. We considered the possibility that the endothelial damage created by excess flow might result in ischemic necrosis of parenchyma served by the disrupted vascular beds.

A subsequent report by Man et al. [28] demonstrated similar findings in a non-arterialized rat model of orthotopic liver transplantation. As soon as 30 min after reperfusion, light microscopy of the livers demonstrated portal congestion, cytoplasmic vacuolar changes and collapse of the space of Disse. Evolution of injury secondary to portal hyper-perfusion also resulted in submassive necrosis in prolonged states of graft dysfunction in SFSS [29].

Man et al. [30] presented studies in AA-LDLT in which the livers were biopsied before and after reperfusion. Electron microscopy of tissue from transplanted livers with a GV/SLV <40 % demonstrated mitochondrial swelling and irregular fenestrations in the sinusoidal endothelium. These findings were coupled with a slower normalization of PVF after reperfusion. Notably, this unrelenting heightened PVF reciprocally decreases arterial flow through the buffer response, resulting in vasospasm, ischemia and infarction [31].

In addition to disturbances caused by vascular trauma and over-perfusion, the liver receiving an excess of portal flow is subjected to nutrient excess, leading to oxidative stress and further tissue damage through both redox pathways and incitement of the pro-inflammatory cascade [32]. Beyond these generic stressors, the portal blood carries levels of endotoxin tenfold higher than the systemic circulation [33]. The intact normal liver, with its system of fixed macrophages and dendritic cells, modulates responses to these pro-inflammatory signals. When the normal limits of homeostasis are overwhelmed, cytokine activation occurs, leading to secondary liver injury [33, 34]. Thus, the triumvirate of stimuli that leads to hepatic regeneration (nutrient delivery, increased exposure to endotoxin, and portal overflow) comprises the same elements that cause graft failure when delivered in excess.

Donor and Recipient Variables Associated with an Increased Risk of Graft Dysfunction

Donor Considerations Other than Size that Contribute to Early Graft Dysfunction

Age may also play a role in post-transplant survival in AA-LDLT. Kiuchi et al. [16] demonstrated that the survival decrement is particularly pronounced when the GRWR is <0.8. In a comparison of 20 donors older than 50 years with 140 donors younger than 50 years, at a mean GV/SLV of approximately 40 %, recipients of grafts from younger donors had lower bilirubin levels and less ascites production [35]. Multivariate logistic regression revealed findings along the same lines, with older donor age emerging as a risk factor for SFSS [36]. Moreover, the mean age of donors whose grafts developed SFSS was 43.1 years (GRWR 0.827), compared with a mean donor age of 34.4 years (GRWR 0.833) for grafts without early dysfunction.

Recipient Factors Involved in Graft Dysfunction

While graft size has been implicated as the principal cause of graft dysfunction in AA-LDLT, this belief may obscure the importance of other variables that also carry weight. Ben-Haim et al. [37] established that patients with severe decompensation require larger grafts. For example, patients with Child-Turcotte-Pugh (CTP) class B or C disease required a GRWR in excess of 0.85 to prevent SFSS. Soejima et al. [38] showed that SFSS after AA-LDLT occurred at a disproportionately higher rate of 43.8 % in cirrhotic patients, compared with 5 % in non-cirrhotics. Moreover, a GV/SLV <45 % in cirrhotic patients was associated with SFSS, whereas in non-cirrhotic patients a threshold of 30 % could be approached before concerns of SFSS arose. Taking these variables into account, the MELD (Model for End-Stage Liver Disease) score, an assessment of the recipient's severity of disease and a predictor of 3-month mortality, has been shown to be predictive of SFSS in AA-LDLT [39].
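As a rough illustration of how the thresholds reported in these series interact (illustrative only, not a validated clinical rule; the cutoffs are those of the studies cited above):

```python
def minimum_graft_thresholds(cirrhotic, ctp_class):
    """Illustrative lower bounds drawn from the cited series, not a
    validated clinical decision rule: Soejima et al. associated a
    GV/SLV <45 % with SFSS in cirrhotics versus ~30 % in non-cirrhotics,
    and Ben-Haim et al. suggested a GRWR >0.85 for CTP class B/C
    recipients versus the usual 0.8 floor.
    Returns (minimum GV/SLV in %, minimum GRWR)."""
    gv_slv_floor = 45.0 if cirrhotic else 30.0
    grwr_floor = 0.85 if ctp_class in ("B", "C") else 0.8
    return gv_slv_floor, grwr_floor

# A decompensated cirrhotic (CTP class C) needs a substantially
# larger graft than a non-cirrhotic (CTP class A) recipient.
print(minimum_graft_thresholds(cirrhotic=True, ctp_class="C"))
print(minimum_graft_thresholds(cirrhotic=False, ctp_class="A"))
```

The point of the sketch is simply that recipient disease severity shifts both size floors upward, which is why weight-based metrics alone underestimate risk in decompensated patients.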

Outflow Obstruction

Venous outflow impairment may also contribute to functional insufficiency in AA-LDLT. Additionally, this impairment might be progressive if the resultant venous congestion further increases the portal pressure to other territories in the small graft. The venous outflow in the right lobe is not always centralized into the right hepatic vein (RHV) [40]. Since the middle hepatic vein (MHV) often drains the territories of segments 5 and 8, venous outflow assurance of the right lobe graft has focused on the MHV drainage field. The anterior portion of the right lobe is particularly susceptible to venous congestion after transection of the feeding vessels into the MHV. While some have supported the inclusion of the MHV in right lobe donation, this may increase risk to the donor by decreasing the residual liver volume [41].

In A2ALL (the Adult to Adult Living Donor Liver Transplantation Cohort Study), SFSS was a very rare finding in US centers using right lobe grafts without additional segment 5 and 8 drainage. Preservation of the anterior segmental venous tributaries has also been advocated in right lobe donation, and several authors have described surgical techniques to facilitate venous drainage of the anterior segment of the right lobe. Miller et al. [42] reported ten cases of vascular reconstruction using conduits from segments 5 and 8 to the MHV or inferior vena cava (IVC). Cattral et al. [43] also reported using an autologous jump graft from the distal segment of the MHV in the donated right lobe to the recipient IVC in AA-LDLT. However, venous reconstruction of the anteromedial segment has not been uniformly accepted, especially in larger grafts. In right lobe grafts with a mean GRWR of 1.35, Detry et al. [44] reported that the theoretical risk of venous congestion in the anteromedial segment in AA-LDLT without reestablishing venous flow did not result in significant graft dysfunction in these larger grafts.

Hepatic vein outflow optimization has also been described for left lobe AA-LDLT to minimize the chance of SFSS. In five patients, Oya et al. [45] described performing an end-to-side anastomosis between the IVC and the plastied MHV and left hepatic vein (LHV), oriented longitudinally in a quarter-counterclockwise position. Compared with nine other patients who had a more standard end-to-end hepatic vein anastomosis, these five patients showed a more precipitous decline in bilirubin levels, suggesting that hepatic function was improved with this newly described technique.

Notwithstanding the importance of decompression of the anteromedial segment in right lobe donation and plasty of the MHV and LHV in left lobe donation, optimization of the caval outflow with IVC cavoplasty may be the most imperative. As described by Emond et al. [46] in the pediatric population, a large triangular cavoplasty was made in the IVC to ensure the widest outflow possible. Posterior cavoplasty has also been described in right lobe AA-LDLT, created by extending the RHV orifice laterally down the IVC [40]. Most recently, Goralczyk et al. [47] established that portal venous pressures (PVP) were lessened by this outflow strategy, which was associated with a sustained decrease in bilirubin levels compared with non-plasty controls.

Preventive Strategies that Have Arisen to Mitigate Small-for-size Syndrome

Inflow Modulation May Alleviate the Portal Hypertension Experienced by the Graft

Troisi et al. [48] were amongst the first to demonstrate reduced portal hyper-perfusion with splenic artery ligation (SAL). Comparing AA-LDLT with and without inflow modulation, two of seven patients without inflow modulation developed SFSS, whereas none of the ten AA-LDLT patients who underwent SAL suffered the sequelae of early graft dysfunction. Undoubtedly, SAL raises the flow in the common hepatic artery, which can counter the imbalance created by the increased PVPs seen in smaller grafts. Early studies by Marcos et al. [24] described an interrelationship between the hepatic artery and PVF in AA-LDLT: as described earlier, an increase in PVF was met with a converse reduction in hepatic artery flow.

Interestingly, Ito et al. [49] demonstrated that 13 patients who underwent AA-LDLT with a GRWR <0.8 and a PVP >20 mmHg in the first week had significantly worse 6-month survival than their controls (38.5 % versus 84.5 %). However, a subset of seven patients with a GRWR <1.0 or PVP >20 mmHg who underwent SAL had survival restored to greater than 80 %. Troisi et al. [50] also demonstrated that a small subset of AA-LDLT patients treated with SAL had a 1-year survival of 93 %, compared with 62 % in the unmodified group. Admittedly, while the cohort in this study was small, a striking 27 % of recipients in the unmodified group developed SFSS, while SAL entirely prevented its manifestations.

The relatively recent application of interventional radiology techniques for inflow modulation has provided an alternative to SAL. One of the first reported uses of splenic artery embolization (SAE) in AA-LDLT was to reduce portal flow after venous congestion and SFSS resulting from thrombosed conduits from segments 5 and 8 to the IVC [51]. Importantly, after SAE the patient's prolonged cholestasis, coagulopathy and ascites resolved, and the post-operative course was favorable. Umeda et al. [52] affirmed these findings and showed that in 30 patients who underwent SAE before AA-LDLT, there was a significant reduction in portal venous velocity with an opposing hepatic artery flow augmentation compared with the control group.

Perhaps the most favorable aspect of SAE is that it empowers clinicians to modulate portal flow after the operation, so that surgeons do not have to commit to inflow modulation within a relatively constrained time period. Thus, the progression of portal flow and the appearance of possible SFSS symptoms can be evaluated and managed in an unhurried setting. Gruttadauria et al. [53] described performing SAE in six patients who developed SFSS in the first week after AA-LDLT. Biopsies of the patients with SFSS were marked by obvious sinusoidal dilatation and hepatocyte atrophy. Importantly, resolution of the symptomatology quickly followed SAE.

Splenectomy has also been reported in the management of inflow for partial grafts. However, the role of splenectomy in AA-LDLT has yet to be fully defined and is not widely accepted, given the perceived risks of infection. Nevertheless, in a univariate logistic regression analysis, Yoshizumi et al. [54] showed that among AA-LDLT patients with a GV/SLV <40 %, the 22 patients without splenectomy carried a hazard ratio of 9.01 for the development of SFSS. Additionally, splenectomy may avert complications associated with SAL/SAE, including splenic infarction and abscess formation [55].

Portocaval Shunting Can Be Used as a Method to Reduce Portal Pressures for Smaller Grafts

As the augmented PVF experienced by smaller grafts became an accepted rationale for SFSS, partial diversion of this portal flow was a natural extension of efforts to improve AA-LDLT. Initial animal experiments described a dramatic benefit of performing a PCS in partial graft transplantation. Smyrniotis et al. [56] created mesocaval shunts after transplantation of eight porcine grafts with a GV/SLV <20 %, achieving 100 % survival at 48 h, compared with a 65 % mortality rate in unshunted transplants. Takada et al. [57] were amongst the first to devise a novel PCS in AA-LDLT, described in two patients with GRWRs of 0.55 and 0.7 and high PVFs. In this technique, an end-to-side anastomosis was created between the recipient right portal vein and the IVC, while the donor portal vein was sewn to the recipient left portal vein branch. A variation of this technique was employed by Masetti et al. [58], in which an interposition graft (saphenous vein) was anastomosed between the recipient right portal vein and the RHV in AA-LDLT; the recipient left portal vein was anastomosed to the donor portal vein. Providing proof of principle, using a left lobe graft with a GV/SLV of only 20 %, this PCS was employed together with splenectomy to lower the heightened PVP after reperfusion.

Expanding the cohort for these techniques, Troisi et al. [59] demonstrated that performing a PCS in eight AA-LDLT patients with a mean GRWR of 0.68, compared with five non-shunted patients with a GRWR of 0.75, not only reduced the PVF but also dramatically improved 1-year survival (87.5 % vs 40 %). While the beneficial effects of portocaval shunting in AA-LDLT are well established, the adverse consequences of diverting portal flow should also be recognized.

Oura et al. [60] reported a patient who underwent a PCS for elevated PVPs with a GV/SLV of 40.7 %. By 11 months after the AA-LDLT, the graft had atrophied from a peak GV/SLV of 74.3 % at 1 month post-transplant to a GV/SLV of 32 %, with concomitant coagulopathy and hyperammonemia. Axial imaging demonstrated a persistent PCS that was ultimately closed operatively, resulting in excellent graft function and reversal of the hepatic atrophy. A case report by Botha et al. [61] also illustrated a portocaval steal phenomenon in a patient with a PCS after AA-LDLT, resulting in hepatic encephalopathy; this PCS was ultimately closed by deploying a covered endograft in the IVC over the orifice of the shunt. However, the prevalence of persistent shunts is somewhat unclear. An analysis of ten patients with PCSs showed a 1-year patency of only 20 % in patients with a mean GRWR of 0.6 [62]. Interestingly, to ensure closure of these shunts, Sato et al. [63] devised a novel strategy of using the round ligament as an interposition graft for the PCS. Given that the round ligament is a vestige of the umbilical vein, it represents an abnormal vein subject to a natural banding effect over time. Accordingly, this case series of four patients reported 100 % auto-closure of the round ligament interposition PCS by 6 months.

Not unexpectedly, the advent of minimally invasive therapeutic modalities to treat liver disease has encouraged their possible utility in AA-LDLT. Similar to SAE, which affords the clinician a wider temporal breadth to assess the functionality of the new graft before committing to inflow modulation, techniques like the transjugular intrahepatic portosystemic shunt (TIPS) may also find a place in management to reduce the risk of SFSS. A recent case report by Xiao et al. [64] illustrates the use of TIPS 35 days after AA-LDLT to nullify the symptoms of SFSS by reducing the PVP from 28.7 mmHg to 15 mmHg. However, its acceptance as a treatment for SFSS remains relegated to case reports, since the long-term consequences of this portal vein diversion have not been vetted.

Peri-Operative Infusion of Vasoactive Agents to Control Portal Pressure

Vasopressin, a vasopressor widely used in critically ill patients, has been useful in liver transplantation due to its effect of lowering portal blood flow [65]. Portal decompression through portal venous infusion of vasoactive agents has also arisen as a means to guard against SFSS in AA-LDLT. In animal and clinical studies, various agents with vasodilator and fibrinolytic properties, such as prostaglandin E1, thromboxane A2 synthetase inhibitor and nafamostat mesilate, have been shown to alleviate hepatic injury through alterations in the hemodynamics of the splanchnic venous system [66–68]. A large clinical study by Suehiro et al. [69] assessed the effect of these medications on patients who underwent AA-LDLT with a GV/SLV <50 %. Intraportal infusion of a cocktail of prostaglandin E1, thromboxane A2 synthetase inhibitor and nafamostat mesilate for 7 days after reperfusion in 53 patients decreased the SFSS rate to 3.8 %, compared with 25.4 % in a control group of 59 patients. Continuous intraportal infusion of prostaglandin E1 alone for 1 week after AA-LDLT in recipients with a mean GRWR of 0.68 resulted in decreased PVP and significantly better 2-year survival [70].

The infusion of other agents into the portal system has also been explored in AA-LDLT to counterbalance the heightened flow dynamics. In a case series by Busani et al. [71], octreotide and esmolol infusion through a jejunal vein for 48 h in three patients with a mean GRWR of 0.6 reduced the hepatic venous pressure gradient from 14.6 mmHg to 9.25 mmHg. Intravenous octreotide, a splanchnic vasoconstrictor, used in concert with oral propranolol, was also shown to ameliorate the coagulopathy and hyperbilirubinemia seen in an AA-LDLT patient who developed SFSS 2 days after transplantation [72].

Partial Grafts are Characterized by Early Changes in the Hepatic Milieu that may Precipitate or Abrogate Small-for-size Syndrome

Regeneration of the Partial Graft May Be the Single Most Important Factor in Preventing Small-for-size Syndrome

While the exact mechanisms that govern regeneration of partial grafts in AA-LDLT remain controversial, several associative and anecdotal phenomena have been noted to coincide with accelerated growth toward the SLV. Hypothetically, this robust early regeneration may ameliorate the symptoms of SFSS as the GRWR and GV/SLV improve. These published accounts lead to the conclusion that multiple factors are probably involved in liver expansion in the peri-operative period.

Initial observations by Ikegami et al. [73], that grafts from donors <30 years old regenerate to 80 % of the SLV earlier than those from older donors, were amongst the first reports to define a difference in regeneration rates in AA-LDLT. It is likely that this unfettered regeneration in younger donors is due to vigorous and unimpaired hepatocellular mechanisms of hepatic biosynthesis. Moreover, the theoretical elasticity of the vasculature in younger grafts may protect smaller grafts from the damaging effects of portal hypertension.

Interestingly, Umeda et al. [52] demonstrated that inflow modulation with either SAL or SAE improved regeneration rates of grafts with a GRWR <0.8. While inflow modulation is established as a technique to lower PVF, these findings add to the confounding picture of regeneration in AA-LDLT. Numerous animal studies have suggested that shear stress from increased portal velocities induces the regenerative effect witnessed in livers [74, 75]. Eguchi et al. [76] also showed, in a small cohort of AA-LDLT patients, that a hyperdynamic portal venous system resulted in accelerated, near-complete re-establishment of 100 % of the SLV at 2 weeks following transplantation. However, these discrepancies may make more sense if graft hypertrophy alone is not strictly correlated with "healthy" regeneration.

In a striking study by Yagi et al. [77], grafts that experienced PVPs >20 mmHg in the first 3 days after AA-LDLT showed increased GV/SLV but poorer outcomes, with hyperbilirubinemia, coagulopathy and persistent ascites. PVPs >20 mmHg in these partial grafts resulted in higher hepatocyte growth factor (HGF) levels but lower vascular endothelial growth factor (VEGF) levels compared with pressures <20 mmHg. Interestingly, the trophic response to HGF with elevated splanchnic flow may have resulted in accelerated graft growth, but the discontinuity with angiogenesis suggests an abnormal regenerative pattern. Thymidine kinase (TK), a marker for liver regeneration, also presents an interesting pattern in grafts with a GV/SLV <20 %: in porcine survivors of partial graft transplantation, TK rose gradually after reperfusion, whereas in those that died of fulminant liver failure, TK spiked dramatically and remained twice that of the former group [78].

In 70 % hepatectomized rats, accelerated regeneration led to lobular disarray, while inhibition of cellular division through the mitogen-activated protein kinase (MEK)/extracellular signal-regulated kinase (ERK) signaling pathway preserved the hepatic architecture [79]. Following liver resection, the formation of hepatic islands of 10-12 hepatocytes surrounded by a rudimentary sinusoidal endothelial cell complex occurs rapidly [80]. Theoretically, these hepatic islands represent less optimized functional hepatic units, the product of hurried hepatocyte proliferation in discontinuity with vascular ingrowth. Using NS-398 or PD98059, inhibitors of this signaling pathway, normal cellular patterns can be reestablished so as to improve survival.

Predictive Accuracy of the Current Tools to Prevent Small-for-size Syndrome

Volumetric and Weight Based Assessments are Rudimentary Methods to Guide Clinical Practice as It Relates to Living Donor Liver Transplantation

While simplicity of use and interpretation have made GRWR and GV/SLV the most frequently utilized metrics for decisions regarding graft selection in AA-LDLT, it is unclear whether they provide an accurate probability of graft dysfunction, specifically SFSS. In an analysis of 107 live donors, Hill et al. [81] demonstrated no association between GRWR and the incidence of SFSS: 13.6 % of recipients with a GRWR <0.8 developed SFSS, compared with 9.4 % of recipients with a GRWR ≥0.8 (p = nonsignificant). While these results counter previously presented reports, and the statistical validity of a single-institution study is not beyond reproach, the findings do raise concerns about strict reliance on volumetric and weight-based assessments. Our recent report from the current A2ALL data demonstrated that size was not a predictive variable in the occurrence of graft dysfunction (Pomposelli, AASLD 2013).

The importance of GV/SLV was challenged even further when comparisons between AA-LDLT patients with a GV/SLV <35 % or ≥35 % showed no significant differences in rates of SFSS [82]. While the mean GV/SLV was 31.8 % and 42.5 % in the respective groups, the 1- and 5-year survival was also similar. Lastly, this report emphasized the exclusive use of left lobe donation and the lack of difference in outcomes between the two groups, suggesting that left lobe AA-LDLT may be the "procedure of choice" given the lesser morbidity incurred by the donor.

The Reemergence of Left Lobe Living Donor Liver Transplantation

Despite the earlier work by Kawasaki et al. [83], which demonstrated no graft dysfunction in left lobe AA-LDLT with a GV/SLV approaching 36 %, the surgical community has proceeded cautiously with left lobe procurement. In 2006, Soejima et al. [84] published an 8-year experience with left lobe AA-LDLT. Interestingly, in 107 grafts with a mean GRWR of 0.8, there was a 25.2 % incidence of intractable ascites production and cholestasis in the recipient. The 27 patients who developed SFSS were found to have a mean GV/SLV of 36 %. However, in the first U.S. series published on AA-LDLT with left lobe grafts, in which recipients also underwent concomitant portocaval shunting, only one patient out of 16 developed SFSS [85]. Moreover, with a mean GV/SLV of 28.5 %, the actuarial 1-year patient survival was 87 %. Of note, two patients were encephalopathic for greater than 2 months and required endovascular occlusion. In another recent report, Ishizaki et al. [86] described 42 consecutive left lobe AA-LDLT recipients with a mean GV/SLV of 39.8 %, none of whom developed SFSS despite the lack of inflow modulation or portocaval shunting. Survival was excellent in this cohort, with 1-, 3- and 5-year survival of 100 %, 97 % and 91 %, respectively. These results may point to a re-embrace of left lobe donation as the ability to predict and guard against SFSS has improved.

Conclusion

Given the noticeable decline in deceased donor donation and the steady demand for donor organs, clinicians have advocated living donation as a way to meet this shortfall. However, there have been numerous concerns regarding the ability of LDLT to meaningfully address the steady death rate on the transplant waitlist. Some cite the risk of death from living donation, estimated at 0.17-0.28 %, as a legitimate shortcoming [87].

However, it is most likely that donor morbidity from hepatectomy, estimated at 30-50 %, is of greatest concern. In general, right lobe donation confers a higher risk than left lobe donation, and left lobe donation carries a higher risk than left lateral segment donation [88, 89]. While surgeons would obviously like to minimize morbidity in the donor, the graft must be large enough to reduce the recipient's risk of SFSS. Meanwhile, the definition of safe parameters for graft hemodynamics remains uncertain and is influenced by many factors other than size [48, 62, 90, 91]. As such, the transplant community is left with the conundrum that although larger grafts are better for recipients, the more extensive hepatectomy adds to donor risk.

Novel strategies have also been introduced to offset donor risk, but they have largely failed to gain acceptance because of their complexity or impracticality. For example, while Lee and others have used dual grafts to provide more liver mass and increase the GRWR, this approach does not seem feasible on a large scale [92, 93]. Other biological interventions in donors to increase liver size may not be safe [94].

Wider implementation of AA-LDLT as a sustainable means to expand the donor pool is at a crucial juncture. Notably, the ability to gain further acceptance of AA-LDLT relies on offsetting both donor and recipient risks [95]. The variability uncovered in both the pathophysiology and the risk factors of SFSS underscores how little is still known regarding its mechanism and the methods to prevent its occurrence. Continued laboratory, clinical, and scientific pursuit is necessary to refine the use of smaller grafts and make living liver donation even more acceptable and applicable.